Dec 06 15:28:51 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 06 15:28:51 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 15:28:51 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 
15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc 
restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 15:28:51 crc restorecon[4681]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:51 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 
15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 15:28:52 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 15:28:52 crc restorecon[4681]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 15:28:52 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 06 15:28:52 crc kubenswrapper[4848]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.837734 4848 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840857 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840875 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840879 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840883 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840887 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840891 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840895 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840899 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840903 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 15:28:52 crc 
kubenswrapper[4848]: W1206 15:28:52.840908 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840912 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840916 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840919 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840931 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840935 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840939 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840944 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840949 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840953 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840958 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840962 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840967 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840971 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840976 4848 feature_gate.go:330] unrecognized 
feature gate: MetricsCollectionProfiles Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840981 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840985 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840991 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.840995 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841000 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841004 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841010 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841016 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841023 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841028 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841032 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841036 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841040 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841043 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841048 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841052 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841057 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841061 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841066 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841069 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841073 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841076 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841080 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 15:28:52 crc 
kubenswrapper[4848]: W1206 15:28:52.841084 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841087 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841094 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841097 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841101 4848 feature_gate.go:330] unrecognized feature gate: Example Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841104 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841108 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841111 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841114 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841118 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841121 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841125 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841128 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841131 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841135 4848 feature_gate.go:330] unrecognized 
feature gate: VSphereControlPlaneMachineSet Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841140 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841143 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841147 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841152 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841156 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841159 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841163 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841167 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.841170 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841370 4848 flags.go:64] FLAG: --address="0.0.0.0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841380 4848 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841388 4848 flags.go:64] FLAG: --anonymous-auth="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841395 4848 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841400 4848 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841405 4848 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841410 4848 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841415 4848 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841419 4848 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841423 4848 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841427 4848 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841432 4848 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841437 4848 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841441 4848 flags.go:64] FLAG: --cgroup-root="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841445 4848 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841449 4848 flags.go:64] FLAG: --client-ca-file="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841453 4848 flags.go:64] FLAG: --cloud-config="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841457 4848 flags.go:64] FLAG: --cloud-provider="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841461 4848 flags.go:64] FLAG: --cluster-dns="[]" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841466 4848 flags.go:64] FLAG: --cluster-domain="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841470 4848 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841474 4848 flags.go:64] FLAG: --config-dir="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841478 
4848 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841482 4848 flags.go:64] FLAG: --container-log-max-files="5" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841488 4848 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841492 4848 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841496 4848 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841500 4848 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841505 4848 flags.go:64] FLAG: --contention-profiling="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841509 4848 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841513 4848 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841517 4848 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841521 4848 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841526 4848 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841530 4848 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841535 4848 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841539 4848 flags.go:64] FLAG: --enable-load-reader="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841543 4848 flags.go:64] FLAG: --enable-server="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841547 4848 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841553 4848 flags.go:64] FLAG: --event-burst="100" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841557 4848 flags.go:64] FLAG: --event-qps="50" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841561 4848 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841565 4848 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841569 4848 flags.go:64] FLAG: --eviction-hard="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841574 4848 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841579 4848 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841583 4848 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841587 4848 flags.go:64] FLAG: --eviction-soft="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841591 4848 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841594 4848 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841599 4848 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841603 4848 flags.go:64] FLAG: --experimental-mounter-path="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841607 4848 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841611 4848 flags.go:64] FLAG: --fail-swap-on="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841615 4848 flags.go:64] FLAG: --feature-gates="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841620 4848 flags.go:64] FLAG: 
--file-check-frequency="20s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841624 4848 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841628 4848 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841632 4848 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841637 4848 flags.go:64] FLAG: --healthz-port="10248" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841641 4848 flags.go:64] FLAG: --help="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841645 4848 flags.go:64] FLAG: --hostname-override="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841649 4848 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841653 4848 flags.go:64] FLAG: --http-check-frequency="20s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841657 4848 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841661 4848 flags.go:64] FLAG: --image-credential-provider-config="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841666 4848 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841670 4848 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841674 4848 flags.go:64] FLAG: --image-service-endpoint="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841678 4848 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841684 4848 flags.go:64] FLAG: --kube-api-burst="100" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841689 4848 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841709 
4848 flags.go:64] FLAG: --kube-api-qps="50" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841715 4848 flags.go:64] FLAG: --kube-reserved="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841720 4848 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841725 4848 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841731 4848 flags.go:64] FLAG: --kubelet-cgroups="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841736 4848 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841741 4848 flags.go:64] FLAG: --lock-file="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841745 4848 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841750 4848 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841755 4848 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841763 4848 flags.go:64] FLAG: --log-json-split-stream="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841768 4848 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841773 4848 flags.go:64] FLAG: --log-text-split-stream="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841778 4848 flags.go:64] FLAG: --logging-format="text" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841783 4848 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841788 4848 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841792 4848 flags.go:64] FLAG: --manifest-url="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841797 4848 
flags.go:64] FLAG: --manifest-url-header="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841803 4848 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841807 4848 flags.go:64] FLAG: --max-open-files="1000000" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841812 4848 flags.go:64] FLAG: --max-pods="110" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841816 4848 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841820 4848 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841824 4848 flags.go:64] FLAG: --memory-manager-policy="None" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841829 4848 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841834 4848 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841839 4848 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841843 4848 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841855 4848 flags.go:64] FLAG: --node-status-max-images="50" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841860 4848 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841864 4848 flags.go:64] FLAG: --oom-score-adj="-999" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841868 4848 flags.go:64] FLAG: --pod-cidr="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841872 4848 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841878 4848 flags.go:64] FLAG: --pod-manifest-path="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841882 4848 flags.go:64] FLAG: --pod-max-pids="-1" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841886 4848 flags.go:64] FLAG: --pods-per-core="0" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841891 4848 flags.go:64] FLAG: --port="10250" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841895 4848 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841899 4848 flags.go:64] FLAG: --provider-id="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841903 4848 flags.go:64] FLAG: --qos-reserved="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841907 4848 flags.go:64] FLAG: --read-only-port="10255" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841911 4848 flags.go:64] FLAG: --register-node="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841915 4848 flags.go:64] FLAG: --register-schedulable="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841919 4848 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841925 4848 flags.go:64] FLAG: --registry-burst="10" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841930 4848 flags.go:64] FLAG: --registry-qps="5" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841933 4848 flags.go:64] FLAG: --reserved-cpus="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841937 4848 flags.go:64] FLAG: --reserved-memory="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841942 4848 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.841946 4848 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841951 4848 flags.go:64] FLAG: --rotate-certificates="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841955 4848 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841959 4848 flags.go:64] FLAG: --runonce="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841963 4848 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841967 4848 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841971 4848 flags.go:64] FLAG: --seccomp-default="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841975 4848 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841980 4848 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841984 4848 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841988 4848 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841992 4848 flags.go:64] FLAG: --storage-driver-password="root" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.841996 4848 flags.go:64] FLAG: --storage-driver-secure="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842000 4848 flags.go:64] FLAG: --storage-driver-table="stats" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842004 4848 flags.go:64] FLAG: --storage-driver-user="root" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842008 4848 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842012 4848 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 06 
15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842016 4848 flags.go:64] FLAG: --system-cgroups="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842021 4848 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842027 4848 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842030 4848 flags.go:64] FLAG: --tls-cert-file="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842034 4848 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842039 4848 flags.go:64] FLAG: --tls-min-version="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842043 4848 flags.go:64] FLAG: --tls-private-key-file="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842047 4848 flags.go:64] FLAG: --topology-manager-policy="none" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842051 4848 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842055 4848 flags.go:64] FLAG: --topology-manager-scope="container" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842059 4848 flags.go:64] FLAG: --v="2" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842065 4848 flags.go:64] FLAG: --version="false" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842070 4848 flags.go:64] FLAG: --vmodule="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842075 4848 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842080 4848 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842185 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842190 4848 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842194 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842198 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842202 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842206 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842211 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842214 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842224 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842228 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842233 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842236 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842240 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842243 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842247 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842250 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842254 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842258 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842261 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842264 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842268 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842271 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842275 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842279 4848 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842282 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842286 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842289 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842293 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842296 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842300 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842304 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842308 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842312 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842315 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842319 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842323 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842326 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842329 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842333 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842337 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842342 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842346 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842349 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842353 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842357 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842361 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842366 4848 feature_gate.go:330] unrecognized feature gate: Example Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842369 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842373 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842376 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842380 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842383 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842387 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842390 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842394 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842397 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842400 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842404 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842407 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 
15:28:52.842411 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842414 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842417 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842421 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842425 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842430 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842433 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842437 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842442 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842446 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842450 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.842453 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.842465 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.850851 4848 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.850889 4848 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.850983 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.850993 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.850999 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851004 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851010 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851016 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851021 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851025 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851030 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851035 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851039 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 
15:28:52.851043 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851048 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851052 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851061 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851070 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851075 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851080 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851085 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851089 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851094 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851099 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851104 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851108 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851112 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 15:28:52 crc 
kubenswrapper[4848]: W1206 15:28:52.851117 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851122 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851127 4848 feature_gate.go:330] unrecognized feature gate: Example Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851131 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851135 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851140 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851146 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851152 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851158 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851164 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851170 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851176 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851181 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851185 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851189 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851194 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851201 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851206 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851211 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851216 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851220 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851225 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851229 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851234 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851239 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851244 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851249 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851254 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851259 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851264 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851269 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 
15:28:52.851274 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851279 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851284 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851288 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851293 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851297 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851302 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851307 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851312 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851316 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851321 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851326 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851331 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851336 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851341 4848 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.851350 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851500 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851512 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851518 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851525 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851532 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851538 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851577 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851585 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851590 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851597 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851602 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851608 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851613 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851618 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851622 4848 feature_gate.go:330] unrecognized feature gate: Example Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851627 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851632 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851637 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851642 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851646 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851651 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851656 4848 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851661 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851665 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851670 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851675 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851680 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851684 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851689 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851710 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851715 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851721 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851725 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851730 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851735 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851739 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 15:28:52 crc kubenswrapper[4848]: 
W1206 15:28:52.851745 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851752 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851757 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851761 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851766 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851771 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851776 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851781 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851786 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851791 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851795 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851800 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851805 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851810 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851815 4848 feature_gate.go:330] unrecognized feature 
gate: PrivateHostedZoneAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851820 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851825 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851830 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851835 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851839 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851844 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851849 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851854 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851859 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851865 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851870 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851878 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851884 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851890 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851895 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851901 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851907 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851913 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851919 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.851925 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.851932 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.852352 4848 server.go:940] "Client rotation is on, will bootstrap in background" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.855607 4848 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.855729 4848 
certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.856198 4848 server.go:997] "Starting client certificate rotation" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.856225 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.856393 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-07 04:00:32.611480926 +0000 UTC Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.856468 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 756h31m39.75501542s for next certificate rotation Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.865346 4848 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.866963 4848 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.873850 4848 log.go:25] "Validated CRI v1 runtime API" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.885006 4848 log.go:25] "Validated CRI v1 image API" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.886184 4848 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.888034 4848 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-06-15-25-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.888066 4848 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm 
major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.902357 4848 manager.go:217] Machine: {Timestamp:2025-12-06 15:28:52.901436456 +0000 UTC m=+0.199447379 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:fdce5a22-c98f-4909-8c21-e3a12013664f BootID:160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ab:02:81 Speed:0 Mtu:1500} {Name:br-int 
MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ab:02:81 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7e:68:92 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cb:c1:d4 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:50:f8:eb Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:11:23:c2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:62:f4:ff:87:5d:84 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:97:57:93:3b:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 
Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.902560 4848 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.902786 4848 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.903130 4848 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.903623 4848 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.903656 4848 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.903857 4848 topology_manager.go:138] "Creating topology manager with none policy" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.903866 4848 container_manager_linux.go:303] "Creating device plugin manager" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.904035 4848 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.904079 4848 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.904252 4848 state_mem.go:36] "Initialized new in-memory state store" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.904332 4848 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.905300 4848 kubelet.go:418] "Attempting to sync node with API server" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.905318 4848 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.905338 4848 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.905350 4848 kubelet.go:324] "Adding apiserver pod source" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.905369 4848 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.906772 4848 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.907148 4848 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.907774 4848 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908271 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908302 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908312 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908322 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908335 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908344 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908353 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908366 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.908270 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 
15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.908356 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.908449 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908378 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.908461 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908483 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908498 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908509 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.908987 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.909395 4848 server.go:1280] "Started kubelet" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.909565 4848 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.909597 4848 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.909959 4848 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 06 15:28:52 crc systemd[1]: Started Kubernetes Kubelet. Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.912127 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.911995 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187ea9eb8037ed8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 15:28:52.909370762 +0000 UTC m=+0.207381665,LastTimestamp:2025-12-06 15:28:52.909370762 +0000 UTC m=+0.207381665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.913559 4848 server.go:460] "Adding debug handlers to kubelet server" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.919973 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.920017 4848 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.920815 4848 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.921965 4848 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.920827 4848 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.920857 4848 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.920802 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 04:39:02.697119895 +0000 UTC Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.921558 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.922482 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.921241 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="200ms" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.929302 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929512 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929611 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929683 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929815 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929890 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.929970 4848 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930037 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930109 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930180 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930248 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930318 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930392 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930467 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930534 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930600 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930666 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930808 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930886 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.930953 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931019 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931129 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931209 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931318 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931398 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931482 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931548 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931651 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931733 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931808 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931906 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.931983 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932050 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932165 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932240 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932306 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932373 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" 
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932464 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932322 4848 factory.go:55] Registering systemd factory Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932605 4848 factory.go:221] Registration of the systemd container factory successfully Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932566 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932753 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932793 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932805 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932816 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932827 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932839 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932850 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932860 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932869 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932879 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932888 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932898 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932908 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932928 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932942 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932953 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" 
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932965 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932975 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.932985 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933002 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933012 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933022 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933033 4848 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933043 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933053 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933063 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933073 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933083 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933094 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933104 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933115 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933124 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933133 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933173 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933183 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933197 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933208 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933217 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933228 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933237 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933247 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933257 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933266 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933276 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933286 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933297 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933308 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933318 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933328 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933337 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933347 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933357 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933366 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" 
seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933375 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933386 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933395 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933407 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933416 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933745 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.933756 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933766 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933774 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933786 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933794 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933804 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933819 4848 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933831 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933834 4848 factory.go:153] Registering CRI-O factory Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933854 4848 factory.go:221] Registration of the crio container factory successfully Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933841 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933924 4848 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933928 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933939 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" 
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933943 4848 factory.go:103] Registering Raw factory Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933951 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933957 4848 manager.go:1196] Started watching for new ooms in manager Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933962 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933974 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933984 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.933994 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934004 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934014 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934026 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934038 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934050 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934059 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934068 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934077 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934087 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934098 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934110 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934128 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934140 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934151 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934162 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934174 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934183 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934192 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934202 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" 
seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934216 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934228 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934241 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934252 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934265 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934278 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934291 4848 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934302 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.934483 4848 manager.go:319] Starting recovery of all containers Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935176 4848 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935199 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935220 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935230 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935239 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935248 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935258 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935268 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935277 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935286 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" 
seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935294 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935303 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935311 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935325 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935334 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935344 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 
15:28:52.935352 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935361 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935370 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935378 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935387 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935396 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935405 4848 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935415 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935425 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935434 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935442 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935451 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935465 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935476 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935489 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935500 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935513 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935522 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935531 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935540 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935552 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935565 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935576 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935619 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935633 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935641 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935652 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935661 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935674 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935686 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935720 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935734 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935747 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935758 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935767 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935778 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935789 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935802 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935816 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935831 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935846 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935860 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935872 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935885 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935897 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935909 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935922 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935939 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935952 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935967 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935981 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.935992 4848 reconstruct.go:97] "Volume reconstruction finished"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.936000 4848 reconciler.go:26] "Reconciler: start to sync state"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.952938 4848 manager.go:324] Recovery completed
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.962609 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.963815 4848 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.964402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.964448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.964462 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965119 4848 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965158 4848 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965186 4848 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.965239 4848 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965454 4848 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965469 4848 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.965485 4848 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 15:28:52 crc kubenswrapper[4848]: W1206 15:28:52.966095 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused
Dec 06 15:28:52 crc kubenswrapper[4848]: E1206 15:28:52.966159 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.974105 4848 policy_none.go:49] "None policy: Start"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.974828 4848 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 06 15:28:52 crc kubenswrapper[4848]: I1206 15:28:52.974851 4848 state_mem.go:35] "Initializing new in-memory state store"
Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.022535 4848 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.027850 4848 manager.go:334] "Starting Device Plugin manager"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.027981 4848 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.027999 4848 server.go:79] "Starting device plugin registration server"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.028403 4848 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.028422 4848 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.028641 4848 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.028747 4848 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.028761 4848 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.038084 4848 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.066313 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.066403 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067496 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067532 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067541 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067666 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067794 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.067827 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068440 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068471 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068549 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068869 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.068894 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069192 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069208 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069822 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069865 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.069959 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.070292 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.070316 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.070902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.070931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.070942 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071089 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071231 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071367 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071429 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071803 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071842 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.071980 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072012 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072093 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072538 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.072567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.123232 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="400ms"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.130336 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.131200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.131226 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.131236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.131259 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.131528 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137817 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137852 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137875 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137895 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137916 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.137934 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138007 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138045 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138072 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138116 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138136 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138160 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138179 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138304 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.138341 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239640 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239693 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239730 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239754 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239802 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239824 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239849 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239881 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239913 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239932 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239938 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239866 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239907 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239963 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.239981 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240006 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240026 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240040 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240047 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240066 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240081 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240102 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\"
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240115 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240122 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240151 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.240135 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.332179 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.333280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.333330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.333344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.333376 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.333906 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.393942 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.404626 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.417641 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-06e49b326c2bd2e4702dc5ccd8171cc553d3a266f80b045522242cbea3d23ac1 WatchSource:0}: Error finding container 06e49b326c2bd2e4702dc5ccd8171cc553d3a266f80b045522242cbea3d23ac1: Status 404 returned error can't find the container with id 06e49b326c2bd2e4702dc5ccd8171cc553d3a266f80b045522242cbea3d23ac1 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.417909 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.420436 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-94e1c541b22fa7e083f6d5a6f9c8abbbbb6841361d5635f88e382eefa9325e60 WatchSource:0}: Error finding container 94e1c541b22fa7e083f6d5a6f9c8abbbbb6841361d5635f88e382eefa9325e60: Status 404 returned error can't find the container with id 94e1c541b22fa7e083f6d5a6f9c8abbbbb6841361d5635f88e382eefa9325e60 Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.436158 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-93eca2929d05eab5f81039108b506e3204a30832a58a8d54549a27eb36a5e7c6 WatchSource:0}: Error finding container 93eca2929d05eab5f81039108b506e3204a30832a58a8d54549a27eb36a5e7c6: Status 404 returned error can't find the container with id 93eca2929d05eab5f81039108b506e3204a30832a58a8d54549a27eb36a5e7c6 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.436144 4848 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.443291 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.481545 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-852195ee5d8d0015d698f66968dca8ccacba9364e85e1960af79613aca3db39c WatchSource:0}: Error finding container 852195ee5d8d0015d698f66968dca8ccacba9364e85e1960af79613aca3db39c: Status 404 returned error can't find the container with id 852195ee5d8d0015d698f66968dca8ccacba9364e85e1960af79613aca3db39c Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.493290 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3c1887d7d9ba7e78dec55acfc58f6e282cd82ea434c11752d81f98a1c8147894 WatchSource:0}: Error finding container 3c1887d7d9ba7e78dec55acfc58f6e282cd82ea434c11752d81f98a1c8147894: Status 404 returned error can't find the container with id 3c1887d7d9ba7e78dec55acfc58f6e282cd82ea434c11752d81f98a1c8147894 Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.523982 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="800ms" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.734617 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.738377 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 
15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.738414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.738426 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.738448 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.738820 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Dec 06 15:28:53 crc kubenswrapper[4848]: W1206 15:28:53.882962 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:53 crc kubenswrapper[4848]: E1206 15:28:53.883022 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.912884 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.923089 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 
08:34:56.573480942 +0000 UTC Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.923127 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 833h6m2.650355947s for next certificate rotation Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.969894 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748" exitCode=0 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.969958 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.970026 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c1887d7d9ba7e78dec55acfc58f6e282cd82ea434c11752d81f98a1c8147894"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.970106 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.970798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.970820 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.970829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.971548 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="23fce4209f5b206783fc270f45f2df42f86d1a8d360149d545f6906ec5bfaaec" exitCode=0 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.971601 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"23fce4209f5b206783fc270f45f2df42f86d1a8d360149d545f6906ec5bfaaec"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.971646 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"852195ee5d8d0015d698f66968dca8ccacba9364e85e1960af79613aca3db39c"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.971743 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.972659 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973502 4848 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ef067c0153864fe2c16687636ff110b01b354aee2e967a913846d25cc8475209" exitCode=0 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973537 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ef067c0153864fe2c16687636ff110b01b354aee2e967a913846d25cc8475209"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973551 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"93eca2929d05eab5f81039108b506e3204a30832a58a8d54549a27eb36a5e7c6"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973590 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973664 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973675 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.973681 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.974475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.974488 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.974495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.975832 4848 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a" exitCode=0 Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.975906 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.975937 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"94e1c541b22fa7e083f6d5a6f9c8abbbbb6841361d5635f88e382eefa9325e60"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.976061 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.977100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.977128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.977139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.981412 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0"} Dec 06 15:28:53 crc kubenswrapper[4848]: I1206 15:28:53.981442 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"06e49b326c2bd2e4702dc5ccd8171cc553d3a266f80b045522242cbea3d23ac1"} Dec 06 15:28:54 crc kubenswrapper[4848]: W1206 15:28:54.047715 4848 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:54 crc kubenswrapper[4848]: E1206 15:28:54.047803 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:54 crc kubenswrapper[4848]: W1206 15:28:54.146663 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:54 crc kubenswrapper[4848]: E1206 15:28:54.146762 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:54 crc kubenswrapper[4848]: E1206 15:28:54.325854 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="1.6s" Dec 06 15:28:54 crc kubenswrapper[4848]: W1206 15:28:54.417026 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": 
dial tcp 38.102.83.64:6443: connect: connection refused Dec 06 15:28:54 crc kubenswrapper[4848]: E1206 15:28:54.417128 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.539076 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.542005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.542045 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.542056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.542084 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 15:28:54 crc kubenswrapper[4848]: E1206 15:28:54.542516 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.985616 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"35856252a9d2379f085bf64d1d43824da7c44ffb4766f237360d934120d55d79"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.985750 4848 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.987143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.987176 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.987186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.989136 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.989161 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.989173 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.989273 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.990070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.990102 4848 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.990112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.992510 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.992533 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.992543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.992519 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.993628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.993664 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.993676 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996416 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996448 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996459 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996468 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996476 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.996545 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.997203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.997233 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 
15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.997242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.998897 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="72049b09786af68c31f5baef596ce4fc60f6068cd1090a1eaeeb06512890f17f" exitCode=0 Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.998931 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"72049b09786af68c31f5baef596ce4fc60f6068cd1090a1eaeeb06512890f17f"} Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.999035 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.999707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.999735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:54 crc kubenswrapper[4848]: I1206 15:28:54.999752 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:55 crc kubenswrapper[4848]: I1206 15:28:55.959666 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:55 crc kubenswrapper[4848]: I1206 15:28:55.968868 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.003828 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="503f1213a67c7260d8fa6ec367e7b9d52293546649b7b45ac98776804602a81f" exitCode=0 Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.003887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"503f1213a67c7260d8fa6ec367e7b9d52293546649b7b45ac98776804602a81f"} Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.004050 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.004088 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.004101 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.004052 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005245 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005272 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005283 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005437 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005485 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005510 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.005525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.143263 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.144324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.144352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.144361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.144380 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 15:28:56 crc kubenswrapper[4848]: I1206 15:28:56.232260 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012549 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"41ffb63874ff22521d2d1b91d48702d96916a9034f6889453fef6b07c370dbc6"} Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012579 
4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012592 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8c05a5234a7500c75f64f33c424f7fd8391d2997409450c123eef84cbca83164"} Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c6beb6c4b1dbd0ea47f10e91a19616bbee902a56bc8790c110c3caa7cd930f1e"} Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64f66ce84f76fbe3b4a0bc1cd58a06cdd56823c13e4b38e8c08a92347aedfb4c"} Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012617 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.012655 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013725 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013739 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 
15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013758 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:57 crc kubenswrapper[4848]: I1206 15:28:57.013765 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.020798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56f50e01afba078eabbacf9da1844e4c998fee016a410bd39f5738ed864cf82c"} Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.020837 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.020902 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.020919 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.022347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.022387 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.022400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.023223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.023272 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:58 crc 
kubenswrapper[4848]: I1206 15:28:58.023288 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.396894 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.405372 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.405584 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.405660 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.407318 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.407349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.407357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.658795 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:28:58 crc kubenswrapper[4848]: I1206 15:28:58.885571 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.023864 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.023917 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.023927 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025427 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.025541 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.628572 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.628784 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.628835 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.632660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.632831 4848 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:28:59 crc kubenswrapper[4848]: I1206 15:28:59.632901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:00 crc kubenswrapper[4848]: I1206 15:29:00.026327 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:00 crc kubenswrapper[4848]: I1206 15:29:00.027803 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:00 crc kubenswrapper[4848]: I1206 15:29:00.027850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:00 crc kubenswrapper[4848]: I1206 15:29:00.027868 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:01 crc kubenswrapper[4848]: I1206 15:29:01.659400 4848 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 15:29:01 crc kubenswrapper[4848]: I1206 15:29:01.659492 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.143628 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.143939 4848 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.145478 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.145542 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.145561 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.683725 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.683889 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.685168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.685219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:02 crc kubenswrapper[4848]: I1206 15:29:02.685231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:03 crc kubenswrapper[4848]: E1206 15:29:03.038294 4848 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 15:29:04 crc kubenswrapper[4848]: E1206 15:29:04.807944 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187ea9eb8037ed8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 15:28:52.909370762 +0000 UTC m=+0.207381665,LastTimestamp:2025-12-06 15:28:52.909370762 +0000 UTC m=+0.207381665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 15:29:04 crc kubenswrapper[4848]: I1206 15:29:04.914550 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 06 15:29:05 crc kubenswrapper[4848]: I1206 15:29:05.849041 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 15:29:05 crc kubenswrapper[4848]: I1206 15:29:05.849090 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 15:29:05 crc kubenswrapper[4848]: I1206 15:29:05.865867 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 15:29:05 crc kubenswrapper[4848]: I1206 15:29:05.865939 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.412653 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.412943 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.414416 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.414473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.414495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.419890 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.912779 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.913000 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.914282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.914347 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.914366 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:08 crc kubenswrapper[4848]: I1206 15:29:08.924145 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.047028 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.047093 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:09 crc kubenswrapper[4848]: I1206 15:29:09.048368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:10 crc kubenswrapper[4848]: E1206 15:29:10.844263 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.845923 4848 trace.go:236] Trace[1891296970]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 15:28:56.823) (total time: 14022ms): Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1891296970]: ---"Objects listed" error: 14022ms (15:29:10.845) Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1891296970]: [14.022621828s] [14.022621828s] END Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.845970 4848 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.846815 4848 trace.go:236] Trace[1209741951]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 15:28:56.362) (total time: 14484ms): Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1209741951]: ---"Objects listed" error: 14483ms (15:29:10.846) Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1209741951]: [14.484080865s] [14.484080865s] END Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.846845 4848 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 15:29:10 crc kubenswrapper[4848]: E1206 15:29:10.847197 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.847226 4848 trace.go:236] Trace[1308717991]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 15:28:56.632) (total time: 14214ms): Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1308717991]: ---"Objects listed" error: 14214ms (15:29:10.847) Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1308717991]: 
[14.214269689s] [14.214269689s] END Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.847242 4848 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.847596 4848 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.848780 4848 trace.go:236] Trace[1186436527]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 15:28:56.097) (total time: 14750ms): Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1186436527]: ---"Objects listed" error: 14750ms (15:29:10.848) Dec 06 15:29:10 crc kubenswrapper[4848]: Trace[1186436527]: [14.750877321s] [14.750877321s] END Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.848803 4848 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.873741 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49514->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.873801 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49514->192.168.126.11:17697: read: connection reset by peer" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.874144 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49526->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.874168 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49526->192.168.126.11:17697: read: connection reset by peer" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.874255 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.874370 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.916110 4848 apiserver.go:52] "Watching apiserver" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.932511 4848 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.932878 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.933421 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:10 crc kubenswrapper[4848]: E1206 15:29:10.933489 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.933549 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.933919 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.934191 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.934346 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:10 crc kubenswrapper[4848]: E1206 15:29:10.934338 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.934440 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:10 crc kubenswrapper[4848]: E1206 15:29:10.934484 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.935820 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940215 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940386 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940521 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940594 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940644 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940788 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940814 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.940920 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.985163 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:10 crc kubenswrapper[4848]: I1206 15:29:10.998382 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.008200 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.022348 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.023060 4848 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.032942 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.041885 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.048883 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.048937 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.048960 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.048984 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049005 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049030 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049080 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049102 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049459 4848 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049464 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049535 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049580 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049610 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049635 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049642 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049660 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.049932 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050070 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050110 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050245 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050279 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050501 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050515 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050573 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050604 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050612 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050648 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050524 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050987 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.050631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051152 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051188 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051220 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051275 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051295 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051369 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051640 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051647 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052242 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.051313 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052296 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052313 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052332 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052354 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052377 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052400 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052425 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052449 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052472 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052493 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052560 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052580 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052622 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052638 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052672 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052718 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052766 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052791 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052861 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.052979 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053029 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053048 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053067 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053190 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053252 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053299 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053315 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053329 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053350 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053381 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053408 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 06 15:29:11 crc 
kubenswrapper[4848]: I1206 15:29:11.053549 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053598 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.053828 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054030 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054157 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054196 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054302 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054399 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054498 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054472 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054635 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054859 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054914 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054931 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054952 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054968 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.054985 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055023 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055061 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055090 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055123 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055139 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055155 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055170 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055292 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055310 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055328 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055345 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055377 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055734 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.055904 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056552 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056642 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056688 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056820 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056840 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.056961 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.057317 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.057413 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.057447 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058474 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058636 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058789 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058922 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.058961 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059051 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059145 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059250 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059216 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059393 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059438 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059488 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059583 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059800 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059826 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059866 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059900 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059934 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060019 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060049 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060104 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060129 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060268 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060326 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060401 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060470 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060501 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060525 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060554 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 
06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.063921 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059276 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059592 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.059585 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060181 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060372 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.060554 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061199 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061279 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061305 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061371 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061501 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061528 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061826 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.061950 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062211 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062238 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062598 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062634 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062065 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.062858 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:29:11.562837705 +0000 UTC m=+18.860848608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071682 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071749 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071775 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071796 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071828 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071853 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071874 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071904 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071928 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071950 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071977 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072004 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072024 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 
06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072063 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072085 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072104 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072126 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072150 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072173 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072197 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072228 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072273 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072307 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 15:29:11 crc 
kubenswrapper[4848]: I1206 15:29:11.072339 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072371 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072415 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072449 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072489 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072526 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072557 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072584 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072592 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072627 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072660 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072707 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072743 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.072957 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073005 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073033 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073063 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073093 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073121 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073151 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073180 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073208 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073244 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073276 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073305 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073332 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073360 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073390 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073418 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073448 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 
15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073480 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073508 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073537 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073567 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073595 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073620 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") 
pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073648 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073681 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073725 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073935 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.073967 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 15:29:11 crc kubenswrapper[4848]: 
I1206 15:29:11.073995 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074022 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074049 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074078 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074103 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074133 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074160 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074186 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074216 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074207 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074250 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074284 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074311 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074357 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074384 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074416 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074446 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074473 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074504 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074537 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074567 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074592 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074622 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074655 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074682 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074730 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074761 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074789 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074817 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074903 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074944 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.074978 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075041 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075071 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075102 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 
15:29:11.075141 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075180 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075210 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075217 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075280 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075489 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075658 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.075832 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.076177 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.076324 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.076361 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.076681 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.063371 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.063614 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.063841 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.064204 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.064313 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.065873 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.066141 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.066153 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.066557 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.067195 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.067966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.062613 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.068169 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.068175 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.069197 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.069401 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.069991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.070460 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071297 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.071381 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078070 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078169 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078240 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078302 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078526 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079080 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078574 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078820 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078840 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.078851 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079017 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079249 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079434 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079531 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079754 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079771 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079787 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079826 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc 
kubenswrapper[4848]: I1206 15:29:11.079841 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079855 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079904 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079919 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079930 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079941 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079953 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079963 4848 
reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.081446 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.081590 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.081834 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.081887 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.081903 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082261 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082445 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082760 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082794 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082821 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.079974 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082874 4848 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082888 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082901 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" 
Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082913 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082924 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082935 4848 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082981 4848 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.082994 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083006 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083017 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083029 4848 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083040 4848 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083050 4848 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083053 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083061 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083088 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083104 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083121 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083135 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083147 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083159 4848 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083171 4848 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083181 4848 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083198 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083208 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083202 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083284 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083297 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083309 4848 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083320 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083331 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083342 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083354 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083363 4848 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083375 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" 
Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083385 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083396 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083407 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083419 4848 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083429 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083441 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083451 4848 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083463 4848 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083474 4848 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083486 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083497 4848 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083507 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083516 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083528 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083537 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 06 
15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083547 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083557 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083567 4848 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083576 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083586 4848 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083596 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083607 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083617 4848 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083627 4848 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083638 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083649 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083659 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083719 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083733 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083747 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083757 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083770 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083783 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083796 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083809 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083822 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083835 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083851 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083862 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083874 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083886 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083900 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083916 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083928 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") 
on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083941 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083953 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083966 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083982 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083996 4848 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084010 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084024 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc 
kubenswrapper[4848]: I1206 15:29:11.084038 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084059 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084033 4848 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084451 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9" exitCode=255 Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.085022 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9"} Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084070 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.087857 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.087872 4848 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083116 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083149 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083330 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083392 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083614 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083537 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083668 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083747 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.083879 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084095 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084137 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084154 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084152 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084236 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084374 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084568 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084630 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084684 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.084891 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.085140 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.085584 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.085675 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.085934 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.086250 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.086529 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.086951 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.087399 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.087542 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.087742 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.086021 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.089482 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.089526 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.089888 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.089907 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.090123 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.090252 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.090421 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.090675 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.090728 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.091133 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.091170 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc 
kubenswrapper[4848]: I1206 15:29:11.091197 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.091236 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:11.591214259 +0000 UTC m=+18.889225172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.091314 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.091313 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.091213 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.091344 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:11.591338012 +0000 UTC m=+18.889348925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.091580 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.091766 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.092273 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.092460 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.093111 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.093239 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.094156 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.094260 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.094274 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.094270 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.094551 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.095662 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.096455 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.099142 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.100160 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.100204 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.100222 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.100302 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:11.600283251 +0000 UTC m=+18.898294154 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.101476 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.101788 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.101912 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.102128 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.102167 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.102256 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.102583 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.103093 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.103241 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.102299 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.103657 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.104008 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.104044 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.104388 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.105173 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.105324 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.105459 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.106045 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.106135 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.110280 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.110828 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.110902 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.112853 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.113733 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.113851 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.114435 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.114474 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.114735 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.114810 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:11.614788618 +0000 UTC m=+18.912799531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.114977 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.117465 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.119113 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.122211 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.124591 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.126804 4848 scope.go:117] "RemoveContainer" containerID="b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.128386 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.129439 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.147476 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.150491 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.153091 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.163892 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.176200 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.187318 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188716 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188755 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188797 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188808 4848 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188946 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188958 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188966 4848 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188975 4848 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188984 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node 
\"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.188992 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189001 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189010 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189020 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189028 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189037 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189049 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189060 
4848 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189069 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189078 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189086 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189096 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189104 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189112 4848 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189121 4848 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189129 4848 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189138 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189148 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189156 4848 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189164 4848 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189173 4848 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189181 4848 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189190 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189200 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189209 4848 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189217 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189225 4848 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189233 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189241 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 
06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189427 4848 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189437 4848 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189445 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189453 4848 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189462 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189470 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189479 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 
15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189488 4848 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189592 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189601 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189613 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189638 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189650 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189658 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 
15:29:11.189667 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189676 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189685 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189709 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189717 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189726 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189737 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189772 4848 reconciler_common.go:293] 
"Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189786 4848 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189797 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189805 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189816 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189825 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189833 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189841 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath 
\"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189849 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189858 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189866 4848 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189874 4848 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189882 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189892 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189899 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189907 4848 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189916 4848 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189924 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189933 4848 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189941 4848 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189951 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189959 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.189967 4848 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190002 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190011 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190024 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190034 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190155 4848 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190164 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190172 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") 
on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190180 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190188 4848 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190196 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190204 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.190254 4848 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.191352 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.191517 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.248663 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.255569 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.263032 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 15:29:11 crc kubenswrapper[4848]: W1206 15:29:11.272879 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7c01fa0e7efc98cef0e14b9c2b3b9036ae76efa592a2a02538068a257a05b043 WatchSource:0}: Error finding container 7c01fa0e7efc98cef0e14b9c2b3b9036ae76efa592a2a02538068a257a05b043: Status 404 returned error can't find the container with id 7c01fa0e7efc98cef0e14b9c2b3b9036ae76efa592a2a02538068a257a05b043 Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.593570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.593949 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:29:12.593882248 +0000 UTC m=+19.891893201 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.594192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.594323 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.594374 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.594436 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 
06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.594566 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:12.594541287 +0000 UTC m=+19.892552200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.594674 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:12.594662341 +0000 UTC m=+19.892673244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.607900 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.611575 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.613093 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.621960 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.626836 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.638209 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.654079 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.664673 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.674444 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.685598 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.695238 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.695296 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695607 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695667 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695687 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695629 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:11 crc 
kubenswrapper[4848]: E1206 15:29:11.695791 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:12.695763765 +0000 UTC m=+19.993774688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695807 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695836 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: E1206 15:29:11.695918 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:12.695896959 +0000 UTC m=+19.993907912 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.699689 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.728509 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.762012 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.777617 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.800627 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.812312 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qk7cq"] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.812828 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.817081 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.817559 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.817678 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.817814 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.819313 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b5fcj"] Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.819562 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.821092 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.821375 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.823518 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.831221 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.853906 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.870167 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.887261 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.896829 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-serviceca\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.896870 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-host\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.896886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5eb75de0-7ecd-4b1a-8322-51af09d62176-hosts-file\") pod \"node-resolver-b5fcj\" (UID: 
\"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.896911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmv9\" (UniqueName: \"kubernetes.io/projected/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-kube-api-access-8pmv9\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.896936 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96lgj\" (UniqueName: \"kubernetes.io/projected/5eb75de0-7ecd-4b1a-8322-51af09d62176-kube-api-access-96lgj\") pod \"node-resolver-b5fcj\" (UID: \"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.914880 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.936478 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.964908 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.986787 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:11Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998300 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-host\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 
crc kubenswrapper[4848]: I1206 15:29:11.998378 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5eb75de0-7ecd-4b1a-8322-51af09d62176-hosts-file\") pod \"node-resolver-b5fcj\" (UID: \"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998418 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmv9\" (UniqueName: \"kubernetes.io/projected/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-kube-api-access-8pmv9\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998450 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96lgj\" (UniqueName: \"kubernetes.io/projected/5eb75de0-7ecd-4b1a-8322-51af09d62176-kube-api-access-96lgj\") pod \"node-resolver-b5fcj\" (UID: \"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998478 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-serviceca\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998478 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5eb75de0-7ecd-4b1a-8322-51af09d62176-hosts-file\") pod \"node-resolver-b5fcj\" (UID: \"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:11 crc kubenswrapper[4848]: I1206 15:29:11.998763 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-host\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.000338 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-serviceca\") pod \"node-ca-qk7cq\" (UID: \"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.003122 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.015576 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmv9\" (UniqueName: \"kubernetes.io/projected/e71fffe2-9e92-49ff-9a8e-6b08e2946b23-kube-api-access-8pmv9\") pod \"node-ca-qk7cq\" (UID: 
\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\") " pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.018141 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96lgj\" (UniqueName: \"kubernetes.io/projected/5eb75de0-7ecd-4b1a-8322-51af09d62176-kube-api-access-96lgj\") pod \"node-resolver-b5fcj\" (UID: \"5eb75de0-7ecd-4b1a-8322-51af09d62176\") " pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.020559 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.035646 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.050736 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.076015 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.088501 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.088547 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.088556 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"60cba72cc7bc917151fad700a81c19daa28dc772c2cfefc507ee5e89b2f76ec9"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.089366 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"972b60085a36914177ad29f12101590da3edec3f6c7f33ce6c0900d0f7ed17d6"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.090424 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.090473 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7c01fa0e7efc98cef0e14b9c2b3b9036ae76efa592a2a02538068a257a05b043"} Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.092535 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.092642 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.094248 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c"} Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.105215 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.114637 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.126888 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qk7cq" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.129873 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.134801 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b5fcj" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.145247 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.149018 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: W1206 15:29:12.152806 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb75de0_7ecd_4b1a_8322_51af09d62176.slice/crio-21fdd632a7d0d739ec8adb0dcc12d473624fd27d1e5bd5dfe449a2f250d961a3 WatchSource:0}: Error finding container 21fdd632a7d0d739ec8adb0dcc12d473624fd27d1e5bd5dfe449a2f250d961a3: Status 404 returned error can't find the container with id 
21fdd632a7d0d739ec8adb0dcc12d473624fd27d1e5bd5dfe449a2f250d961a3 Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.164493 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.178454 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.196028 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.217031 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.217265 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-7mrg5"] Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.217631 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.221987 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.222180 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.222358 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.222549 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.222570 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.240875 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.251036 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.263857 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.279720 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.293444 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.301515 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fc8499a5-41f5-49e8-a206-3240532ec6a0-rootfs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.301574 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9fs\" (UniqueName: \"kubernetes.io/projected/fc8499a5-41f5-49e8-a206-3240532ec6a0-kube-api-access-pl9fs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.301600 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8499a5-41f5-49e8-a206-3240532ec6a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.301624 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc8499a5-41f5-49e8-a206-3240532ec6a0-proxy-tls\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.308036 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.322177 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.333507 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.352370 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.364598 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.379095 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.395002 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.402416 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8499a5-41f5-49e8-a206-3240532ec6a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 
15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.402453 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc8499a5-41f5-49e8-a206-3240532ec6a0-proxy-tls\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.402504 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fc8499a5-41f5-49e8-a206-3240532ec6a0-rootfs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.402524 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl9fs\" (UniqueName: \"kubernetes.io/projected/fc8499a5-41f5-49e8-a206-3240532ec6a0-kube-api-access-pl9fs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.402746 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fc8499a5-41f5-49e8-a206-3240532ec6a0-rootfs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.404598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8499a5-41f5-49e8-a206-3240532ec6a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.409263 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc8499a5-41f5-49e8-a206-3240532ec6a0-proxy-tls\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.416298 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.425436 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9fs\" (UniqueName: \"kubernetes.io/projected/fc8499a5-41f5-49e8-a206-3240532ec6a0-kube-api-access-pl9fs\") pod \"machine-config-daemon-7mrg5\" (UID: \"fc8499a5-41f5-49e8-a206-3240532ec6a0\") " pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.433285 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.535927 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:29:12 crc kubenswrapper[4848]: W1206 15:29:12.548724 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8499a5_41f5_49e8_a206_3240532ec6a0.slice/crio-ec1f21ff05b4efdd26938b5175dbc801768ae3f72910f716ce11faef91493173 WatchSource:0}: Error finding container ec1f21ff05b4efdd26938b5175dbc801768ae3f72910f716ce11faef91493173: Status 404 returned error can't find the container with id ec1f21ff05b4efdd26938b5175dbc801768ae3f72910f716ce11faef91493173 Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.604947 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.605100 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.605139 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.605167 4848 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:29:14.605134225 +0000 UTC m=+21.903145148 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.605245 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.605305 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:14.605286789 +0000 UTC m=+21.903297702 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.605320 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.605453 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:14.605425564 +0000 UTC m=+21.903436537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.706134 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.706243 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706350 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706382 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706395 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706422 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706453 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706474 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706458 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:14.706439094 +0000 UTC m=+22.004450067 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.706617 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:14.706560759 +0000 UTC m=+22.004571672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.965792 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.965811 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.965902 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.965969 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.966054 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:12 crc kubenswrapper[4848]: E1206 15:29:12.966151 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.973259 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.974513 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.975361 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.977047 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.977677 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.978955 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.979924 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.989291 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.991033 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.992289 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.993022 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.997746 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.998899 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 06 15:29:12 crc kubenswrapper[4848]: I1206 15:29:12.999689 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.001029 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.001907 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.003672 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.004303 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.005178 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.006759 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.007455 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.008360 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.011170 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.012621 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.013563 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.019461 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.020827 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.022028 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.022789 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.024033 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.024643 4848 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.024845 4848 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.027469 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.028187 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.028727 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.031289 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.033192 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.034160 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.037271 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.038159 4848 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.039309 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.040109 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.041245 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.042391 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.043089 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.044325 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.044904 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.045677 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.047357 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.048189 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.051371 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.052158 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.053182 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.054507 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.055227 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.057956 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qx6m8"] Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.058366 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zmmx7"] Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.058983 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g4jc"] Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.059862 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.059957 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.060155 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.067068 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.067236 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.067388 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.067485 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.067932 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.068116 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.075138 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.075350 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077123 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077250 4848 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077400 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077540 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077636 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077794 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.077899 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.114498 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.114542 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.114555 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"ec1f21ff05b4efdd26938b5175dbc801768ae3f72910f716ce11faef91493173"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.117927 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.128970 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b5fcj" event={"ID":"5eb75de0-7ecd-4b1a-8322-51af09d62176","Type":"ContainerStarted","Data":"ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.129018 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/node-resolver-b5fcj" event={"ID":"5eb75de0-7ecd-4b1a-8322-51af09d62176","Type":"ContainerStarted","Data":"21fdd632a7d0d739ec8adb0dcc12d473624fd27d1e5bd5dfe449a2f250d961a3"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.140318 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.141891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qk7cq" event={"ID":"e71fffe2-9e92-49ff-9a8e-6b08e2946b23","Type":"ContainerStarted","Data":"7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.141943 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qk7cq" 
event={"ID":"e71fffe2-9e92-49ff-9a8e-6b08e2946b23","Type":"ContainerStarted","Data":"60275880784eecc15e7babf7df1c0d539af01439a45997ec4e3c416c0dd5849d"} Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.173038 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92ed
af5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.189325 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.199303 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209689 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209762 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-conf-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209798 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209835 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-kubelet\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209852 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209884 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-hostroot\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209921 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-system-cni-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-multus-certs\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.209981 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kr6h\" (UniqueName: \"kubernetes.io/projected/9c409d16-f97d-4bcd-bf25-b80af1b16922-kube-api-access-9kr6h\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210006 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-system-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210046 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-cni-binary-copy\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210066 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210120 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210140 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210157 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-k8s-cni-cncf-io\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210172 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznnm\" (UniqueName: \"kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210307 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210380 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210466 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-socket-dir-parent\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-daemon-config\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210619 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210647 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210715 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k855v\" (UniqueName: \"kubernetes.io/projected/a6dad9b9-172f-494c-adb1-da5c45b89ebd-kube-api-access-k855v\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: 
I1206 15:29:13.210734 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-bin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210726 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210750 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210884 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210927 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-os-release\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210950 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.210965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-os-release\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211001 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211116 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211149 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211169 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211201 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-etc-kubernetes\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211220 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-netns\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211234 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-multus\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211251 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211270 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-cnibin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211283 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211297 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.211335 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cnibin\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.223436 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.238058 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.250915 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.268402 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.282821 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.297801 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.311491 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.311913 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-kubelet\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.311961 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.311979 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-hostroot\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.311996 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312014 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-system-cni-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312031 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-kubelet\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312043 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch\") 
pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312054 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-hostroot\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312045 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-multus-certs\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312078 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-multus-certs\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312085 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312111 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kr6h\" (UniqueName: \"kubernetes.io/projected/9c409d16-f97d-4bcd-bf25-b80af1b16922-kube-api-access-9kr6h\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 
crc kubenswrapper[4848]: I1206 15:29:13.312131 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-system-cni-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312168 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-system-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312188 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-cni-binary-copy\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312202 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312225 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312240 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312255 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-k8s-cni-cncf-io\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312268 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312281 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznnm\" (UniqueName: \"kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312299 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312317 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312331 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312340 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-system-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-socket-dir-parent\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312381 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-socket-dir-parent\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312397 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-daemon-config\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312420 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k855v\" (UniqueName: \"kubernetes.io/projected/a6dad9b9-172f-494c-adb1-da5c45b89ebd-kube-api-access-k855v\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312503 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-bin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312519 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312534 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312554 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-os-release\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312569 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312585 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-os-release\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312602 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312627 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312641 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312656 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312679 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-etc-kubernetes\") pod \"multus-qx6m8\" 
(UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312713 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-netns\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312728 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-multus\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312743 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312762 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-cnibin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312777 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 
15:29:13.312793 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312818 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cnibin\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312848 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-conf-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312854 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-cni-binary-copy\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312862 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312889 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312912 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312912 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312912 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312940 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-k8s-cni-cncf-io\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.312968 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313040 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-daemon-config\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313126 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313522 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config\") pod 
\"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313568 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313580 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313608 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-etc-kubernetes\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313635 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-run-netns\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313656 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-multus\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc 
kubenswrapper[4848]: I1206 15:29:13.313680 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313687 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-host-var-lib-cni-bin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313735 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313737 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313754 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313757 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-os-release\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313736 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-cnibin\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313795 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cnibin\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313816 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-os-release\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313830 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.313846 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket\") pod 
\"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.314091 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-cni-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.314180 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6dad9b9-172f-494c-adb1-da5c45b89ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.314203 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6dad9b9-172f-494c-adb1-da5c45b89ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.314235 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c409d16-f97d-4bcd-bf25-b80af1b16922-multus-conf-dir\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.322474 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.326513 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.329143 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kr6h\" (UniqueName: \"kubernetes.io/projected/9c409d16-f97d-4bcd-bf25-b80af1b16922-kube-api-access-9kr6h\") pod \"multus-qx6m8\" (UID: \"9c409d16-f97d-4bcd-bf25-b80af1b16922\") " pod="openshift-multus/multus-qx6m8" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.330682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznnm\" (UniqueName: \"kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm\") pod \"ovnkube-node-8g4jc\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.340318 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.344029 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k855v\" (UniqueName: \"kubernetes.io/projected/a6dad9b9-172f-494c-adb1-da5c45b89ebd-kube-api-access-k855v\") pod \"multus-additional-cni-plugins-zmmx7\" (UID: \"a6dad9b9-172f-494c-adb1-da5c45b89ebd\") " pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.355376 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.367592 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.378147 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.394989 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.410000 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.425322 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.429281 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.438963 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.440953 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" Dec 06 15:29:13 crc kubenswrapper[4848]: W1206 15:29:13.443265 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f17f1f8_8f7c_41bc_bccf_dafeeb0b7135.slice/crio-19da850c4e2a55d8aad805fee1f96c94045408809464b9ba54f7dd2e1ccf068d WatchSource:0}: Error finding container 19da850c4e2a55d8aad805fee1f96c94045408809464b9ba54f7dd2e1ccf068d: Status 404 returned error can't find the container with id 19da850c4e2a55d8aad805fee1f96c94045408809464b9ba54f7dd2e1ccf068d Dec 06 15:29:13 crc kubenswrapper[4848]: W1206 15:29:13.452633 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6dad9b9_172f_494c_adb1_da5c45b89ebd.slice/crio-d0529f1d934ca9530e320829612077f649188ee4e09a3fe157d79abb0d74188d WatchSource:0}: Error finding container d0529f1d934ca9530e320829612077f649188ee4e09a3fe157d79abb0d74188d: Status 404 returned error can't find the container with id 
d0529f1d934ca9530e320829612077f649188ee4e09a3fe157d79abb0d74188d Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.460536 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:13 crc kubenswrapper[4848]: I1206 15:29:13.463618 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qx6m8" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.047347 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.049802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.049835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.049848 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.049993 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.058146 4848 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.058401 4848 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.059641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.059691 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.059721 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.059743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.059757 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.082417 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.088781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.088822 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.088836 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.088854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.088866 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.106203 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.109969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.110162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.110260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.110355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.110449 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.124606 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.129921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.129971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.129981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.129998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.130010 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.145132 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.149583 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.149644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.149659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.149691 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.149754 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.154687 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.155840 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" exitCode=0 Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.156012 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.156047 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"19da850c4e2a55d8aad805fee1f96c94045408809464b9ba54f7dd2e1ccf068d"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.158018 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerStarted","Data":"7def33d6f4bc96fe9e73806e6daf08ef5ac8e56d717bd4c7b3d8211adf193288"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.160963 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerStarted","Data":"91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.161011 4848 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerStarted","Data":"d0529f1d934ca9530e320829612077f649188ee4e09a3fe157d79abb0d74188d"} Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.168595 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a1
2013664f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.168983 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.171523 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.171672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.171790 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.171922 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.171806 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.172031 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.199735 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.223832 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.263661 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.277733 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.277780 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.277788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.277803 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.277813 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.288052 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.298312 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.325722 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.344170 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.357323 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.377406 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.389035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.389078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.389090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.389107 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.389118 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.393675 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.412193 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.428855 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.441810 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.454666 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.469146 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.482987 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.492160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.492200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.492212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.492230 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.492242 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.495547 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.511837 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.523893 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.540352 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.554674 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.566466 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.582884 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.594151 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.594188 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.594199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.594215 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.594228 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.599041 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.613236 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.626243 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.630125 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.630272 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.630335 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.630420 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.630488 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:18.630472535 +0000 UTC m=+25.928483448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.630557 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:29:18.630546858 +0000 UTC m=+25.928557771 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.630636 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.630673 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:18.630664271 +0000 UTC m=+25.928675184 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.638368 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:14Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.696011 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.696659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.696674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.696692 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.696726 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.730914 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.730966 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731080 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731098 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731108 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731142 4848 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:18.731129885 +0000 UTC m=+26.029140798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731184 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731195 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731202 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.731220 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:18.731214108 +0000 UTC m=+26.029225021 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.800284 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.800328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.800337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.800373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.800387 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.903644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.903688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.903716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.903742 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.903755 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:14Z","lastTransitionTime":"2025-12-06T15:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.965730 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.965780 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.965868 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.965922 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:14 crc kubenswrapper[4848]: I1206 15:29:14.966889 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:14 crc kubenswrapper[4848]: E1206 15:29:14.967071 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.006226 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.006279 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.006291 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.006311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.006325 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.109860 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.110750 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.111114 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.111445 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.111538 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168199 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168265 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168286 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168306 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168325 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.168343 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:29:15 crc kubenswrapper[4848]: 
I1206 15:29:15.169508 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerStarted","Data":"56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.171161 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb" exitCode=0 Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.171247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.187045 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.198580 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.209034 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.213787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.213826 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.213839 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 
15:29:15.213856 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.213868 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.222338 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.236331 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.247402 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.259917 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.271230 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.280389 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.292345 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.303361 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.312299 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.321370 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.321414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.321425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.321444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.321456 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.328622 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.342104 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.359767 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.378015 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.391840 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.407386 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.422444 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.423636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.423707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.423716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.423732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.423745 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.436791 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z 
is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.454401 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.470846 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.503554 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.526335 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.526373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.526394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 
15:29:15.526413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.526425 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.535587 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.575525 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.613764 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.628839 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.628872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.628883 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.628899 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.628909 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.652283 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.718002 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:15Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.730735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.730787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.730804 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.730822 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.730835 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.832634 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.832672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.832682 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.832710 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.832721 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.934892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.934925 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.934934 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.934947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:15 crc kubenswrapper[4848]: I1206 15:29:15.934958 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:15Z","lastTransitionTime":"2025-12-06T15:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.037338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.037379 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.037392 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.037410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.037424 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.139611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.139672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.139690 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.139750 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.139771 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.178550 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6" exitCode=0 Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.178616 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.194229 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.207941 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.222483 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.243112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc 
kubenswrapper[4848]: I1206 15:29:16.243189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.243209 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.243240 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.243264 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.245986 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.262976 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.276156 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.293401 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.307954 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.323162 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.338531 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e42
60c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.345844 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.345887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.345903 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.345926 4848 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.345941 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.350338 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.375058 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.394096 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.416468 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:16Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.448261 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.448294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.448304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.448316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.448325 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.550740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.550773 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.550784 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.550797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.550806 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.653266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.653375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.653395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.653428 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.653448 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.756118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.756155 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.756165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.756177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.756185 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.858774 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.858812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.858823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.858841 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.858853 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.961751 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.961781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.961789 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.961802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.961811 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:16Z","lastTransitionTime":"2025-12-06T15:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.966258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.966297 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:16 crc kubenswrapper[4848]: I1206 15:29:16.966357 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:16 crc kubenswrapper[4848]: E1206 15:29:16.966452 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:16 crc kubenswrapper[4848]: E1206 15:29:16.966629 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:16 crc kubenswrapper[4848]: E1206 15:29:16.971906 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.063603 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.063638 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.063646 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.063659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.063667 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.166652 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.166763 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.166788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.166828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.166853 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.185660 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.187714 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b" exitCode=0 Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.187748 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.209962 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.228971 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.241623 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.255724 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.268733 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.271238 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.271286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.271300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.271324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.271337 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.280428 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.288393 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.304392 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.316956 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.330566 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.343758 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.362144 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.373869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.373911 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.373924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.373941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.373952 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.375223 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 
15:29:17.390017 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:17Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.480421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.480878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.480891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.480911 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.480922 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.583521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.583557 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.583566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.583581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.583592 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.685344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.685403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.685414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.685430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.685442 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.795420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.795492 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.795504 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.795528 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.795544 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.898913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.899014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.899042 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.899075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:17 crc kubenswrapper[4848]: I1206 15:29:17.899095 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:17Z","lastTransitionTime":"2025-12-06T15:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.003024 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.003070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.003082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.003099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.003109 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.105055 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.105094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.105107 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.105124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.105135 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.193829 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12" exitCode=0 Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.193865 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.206428 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.207352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.207379 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.207388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.207402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.207412 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.217420 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z 
is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.229264 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.240623 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.252360 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.263768 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.278751 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.293013 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.302715 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.309413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.309513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.309569 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.309624 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.309691 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.312467 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.332793 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.351037 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.366527 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.382603 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:18Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.412123 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.412159 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.412169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.412182 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.412192 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.515050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.515301 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.515360 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.515417 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.515483 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.618357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.618395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.618409 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.618425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.618437 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.670828 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.671080 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 15:29:26.67104092 +0000 UTC m=+33.969051863 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.671257 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.671355 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.671420 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:26.671404131 +0000 UTC m=+33.969415084 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.671507 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.671554 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.671793 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:26.671772402 +0000 UTC m=+33.969783315 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.720850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.720886 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.720895 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.720910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.720921 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.772985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.773041 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773162 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773162 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773179 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773193 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773197 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773205 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773246 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:26.773230555 +0000 UTC m=+34.071241478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.773263 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:26.773255716 +0000 UTC m=+34.071266639 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.822503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.822713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.822815 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.822885 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.822963 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.926252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.926283 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.926290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.926303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.926311 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:18Z","lastTransitionTime":"2025-12-06T15:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.966017 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.966039 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:18 crc kubenswrapper[4848]: I1206 15:29:18.966048 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.966187 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.966248 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:18 crc kubenswrapper[4848]: E1206 15:29:18.966316 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.028939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.028988 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.028997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.029012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.029023 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.131315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.131351 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.131360 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.131372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.131380 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.199618 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.200513 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.200566 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.204046 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2" exitCode=0 Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.204075 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.212434 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.222426 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.228688 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.229410 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.233460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.233478 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.233487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.233500 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.233510 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.243794 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.255402 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.268780 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.283633 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 
15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.296303 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.308727 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.317242 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.328333 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.336124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.336153 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.336161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.336175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.336183 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.338778 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc
9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.348645 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.364527 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.378977 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.389445 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.400002 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.410719 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.420043 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.431030 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.438224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc 
kubenswrapper[4848]: I1206 15:29:19.438264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.438274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.438290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.438301 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.442591 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.452652 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.461267 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.472084 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.482147 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 
15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.490963 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.505471 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.517092 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.527732 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:19Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.540288 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.540317 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.540326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.540339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.540347 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.642110 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.642154 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.642163 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.642177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.642186 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.746181 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.746236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.746254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.746278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.746296 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.848956 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.849078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.849112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.849144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.849161 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.951473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.951510 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.951522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.951539 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:19 crc kubenswrapper[4848]: I1206 15:29:19.951553 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:19Z","lastTransitionTime":"2025-12-06T15:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.054883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.054951 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.054967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.054989 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.055014 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.158525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.158616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.158643 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.158677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.158761 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.211370 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6dad9b9-172f-494c-adb1-da5c45b89ebd" containerID="e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f" exitCode=0 Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.211475 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerDied","Data":"e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.211750 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.227855 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017b
abc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.242839 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.256369 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.265932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.265981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.265992 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.266013 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.266026 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.268849 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.297010 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.309339 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
6T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.321453 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.332660 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.344612 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.358569 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.368887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc 
kubenswrapper[4848]: I1206 15:29:20.368920 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.368929 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.368944 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.368954 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.372411 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.386721 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.401425 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.412976 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:20Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.471190 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.471222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.471231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.471244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.471253 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.574357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.574386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.574395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.574406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.574416 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.677729 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.677804 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.677820 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.677843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.677861 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.781763 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.781823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.781840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.781865 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.781883 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.885526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.885606 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.885624 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.885649 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.885666 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.966170 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.966170 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.966301 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:20 crc kubenswrapper[4848]: E1206 15:29:20.966439 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:20 crc kubenswrapper[4848]: E1206 15:29:20.966867 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:20 crc kubenswrapper[4848]: E1206 15:29:20.966943 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.988941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.988977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.989023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.989040 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:20 crc kubenswrapper[4848]: I1206 15:29:20.989052 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:20Z","lastTransitionTime":"2025-12-06T15:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.092398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.092448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.092462 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.092481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.092495 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.195875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.195940 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.195963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.195994 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.196020 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.227429 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" event={"ID":"a6dad9b9-172f-494c-adb1-da5c45b89ebd","Type":"ContainerStarted","Data":"21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.227568 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.255912 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.278458 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.299900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.299980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.300130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.300173 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.300198 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.310090 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.335073 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a
8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46c
dc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T
15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.351294 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.370564 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.388552 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.403048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.403096 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.403111 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.403131 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.403145 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.409287 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.425326 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.438693 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.449653 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.460827 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.471433 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.483964 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:21Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.505973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.506016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.506025 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.506051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.506062 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.607963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.608015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.608030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.608047 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.608064 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.711217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.711266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.711276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.711292 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.711304 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.814767 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.815157 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.815175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.815199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.815220 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.918066 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.918128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.918147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.918189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:21 crc kubenswrapper[4848]: I1206 15:29:21.918208 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:21Z","lastTransitionTime":"2025-12-06T15:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.020224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.020275 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.020286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.020305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.020319 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.123191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.123304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.123329 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.123459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.123513 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.149327 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.173022 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.192266 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.205306 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.218243 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.226005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.226057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.226069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc 
kubenswrapper[4848]: I1206 15:29:22.226083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.226095 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.232216 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.234057 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/0.log" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.236602 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e" exitCode=1 Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.236632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.237934 4848 scope.go:117] "RemoveContainer" containerID="d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.259532 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4d
b15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.272446 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.285432 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.297875 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.309908 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.323122 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.328721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.328744 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.328753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.328765 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.328774 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.332521 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.353981 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.368439 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a
8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46c
dc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T
15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.382816 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.430315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc 
kubenswrapper[4848]: I1206 15:29:22.430352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.430361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.430375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.430404 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.437959 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.449891 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.463242 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.475465 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.490578 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4d
b15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.501278 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.511713 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.522214 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.532596 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.532632 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.532642 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.532657 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.532668 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.541222 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 
15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f3
2dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.554642 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.580751 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.594110 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.616998 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.634957 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.635003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.635014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.635028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.635039 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.737745 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.737801 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.737810 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.737825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.737837 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.840620 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.840658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.840668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.840687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.840710 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.943271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.943320 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.943336 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.943357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.943375 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:22Z","lastTransitionTime":"2025-12-06T15:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.965925 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.965928 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.966083 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:22 crc kubenswrapper[4848]: E1206 15:29:22.966138 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:22 crc kubenswrapper[4848]: E1206 15:29:22.966216 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:22 crc kubenswrapper[4848]: E1206 15:29:22.966337 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.981843 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:22 crc kubenswrapper[4848]: I1206 15:29:22.992569 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.000497 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:22Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.011818 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4d
b15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.023884 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.034633 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.044927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.044963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.044973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.044989 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.044999 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.071245 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 
15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f3
2dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.089196 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.101230 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.112674 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.128973 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.146879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.146926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.146940 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.146964 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.146981 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.147391 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.162056 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.173815 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.240303 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/0.log" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.243249 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.243370 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.248428 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.248460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.248468 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.248481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.248490 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.262572 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c2
67f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.275211 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.289729 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.308867 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.331256 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350058 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.350495 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.364026 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.385211 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.404403 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.422206 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.442281 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:
10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27ab
f37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.452923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.452953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.452961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.452973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.452983 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.505041 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.522952 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.536616 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:23Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.555797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.555872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.555891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.555927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.555951 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.658677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.658783 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.658807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.658840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.658864 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.762463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.762511 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.762524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.762541 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.762550 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.866447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.866521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.866539 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.866572 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.866595 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.969498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.969546 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.969558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.969578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:23 crc kubenswrapper[4848]: I1206 15:29:23.969588 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:23Z","lastTransitionTime":"2025-12-06T15:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.072681 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.072732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.072741 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.072754 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.072764 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.176014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.176070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.176083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.176104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.176201 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.249665 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/1.log" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.250527 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/0.log" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.254214 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d" exitCode=1 Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.254280 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.254409 4848 scope.go:117] "RemoveContainer" containerID="d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.256053 4848 scope.go:117] "RemoveContainer" containerID="6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.256638 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.275756 4848 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.281142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.281214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.281233 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.281260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.281276 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.295486 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.310292 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.325328 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.344331 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.361999 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:
10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27ab
f37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.376580 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.383467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.383509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.383520 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.383535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.383544 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.392834 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.402753 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.421482 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.441415 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.458901 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.478401 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.478487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.478535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.478566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.478585 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.502506 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.506908 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.516202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.516290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.516334 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.516357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.516373 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.534116 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.539395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.539463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.539480 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.539504 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.539520 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.547579 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.563111 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.566877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.566905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.566913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.566927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.566936 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.580005 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.583774 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.584018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.584096 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.584169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.584236 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.598107 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:24Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.598307 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.600309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.600410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.600484 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.600574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.600641 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.704359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.704743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.704830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.704903 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.704967 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.808599 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.809220 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.809414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.809589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.809794 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.913669 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.913781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.913808 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.913844 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.913872 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:24Z","lastTransitionTime":"2025-12-06T15:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.966014 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.966168 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.966405 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.966177 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:24 crc kubenswrapper[4848]: I1206 15:29:24.966682 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:24 crc kubenswrapper[4848]: E1206 15:29:24.966876 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.017460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.017998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.018262 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.018486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.018748 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.122301 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.122342 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.122354 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.122370 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.122381 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.225747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.225795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.225804 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.225820 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.225835 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.260733 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/1.log" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.329386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.329427 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.329436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.329456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.329467 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.414851 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk"] Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.415691 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.420135 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.420165 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.434103 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.437074 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.437126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.437174 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.437202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.437220 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.450033 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.460509 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.460652 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 
crc kubenswrapper[4848]: I1206 15:29:25.460760 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b907644-0c15-4494-a36e-b97960b3ab69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.460913 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5bc\" (UniqueName: \"kubernetes.io/projected/7b907644-0c15-4494-a36e-b97960b3ab69-kube-api-access-7t5bc\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.469664 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.486365 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.501305 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.515468 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.540561 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.540614 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.540626 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.540647 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.540662 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.545927 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.562157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b907644-0c15-4494-a36e-b97960b3ab69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.562265 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5bc\" (UniqueName: \"kubernetes.io/projected/7b907644-0c15-4494-a36e-b97960b3ab69-kube-api-access-7t5bc\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.562323 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.562355 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.564205 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.565588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b907644-0c15-4494-a36e-b97960b3ab69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.570682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b907644-0c15-4494-a36e-b97960b3ab69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 
06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.572967 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4
d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.592283 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5bc\" (UniqueName: \"kubernetes.io/projected/7b907644-0c15-4494-a36e-b97960b3ab69-kube-api-access-7t5bc\") pod \"ovnkube-control-plane-749d76644c-96ljk\" (UID: \"7b907644-0c15-4494-a36e-b97960b3ab69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.594014 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.610917 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.626426 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.642684 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.643119 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc 
kubenswrapper[4848]: I1206 15:29:25.643160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.643172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.643190 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.643202 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.660503 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.673848 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.694218 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:25Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.732596 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.747477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.747953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.747980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.748018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.748044 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: W1206 15:29:25.761930 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b907644_0c15_4494_a36e_b97960b3ab69.slice/crio-feea33c526d011c045286170c89c2f5bdd0695619fc686f207675fc311826f04 WatchSource:0}: Error finding container feea33c526d011c045286170c89c2f5bdd0695619fc686f207675fc311826f04: Status 404 returned error can't find the container with id feea33c526d011c045286170c89c2f5bdd0695619fc686f207675fc311826f04 Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.851088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.851152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.851172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.851198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.851218 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.953741 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.953785 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.953796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.953812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:25 crc kubenswrapper[4848]: I1206 15:29:25.953822 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:25Z","lastTransitionTime":"2025-12-06T15:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.057379 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.057432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.057442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.057459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.057474 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.163788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.163838 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.163853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.163876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.163891 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.267242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.267283 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.267294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.267309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.267321 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.269972 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" event={"ID":"7b907644-0c15-4494-a36e-b97960b3ab69","Type":"ContainerStarted","Data":"feea33c526d011c045286170c89c2f5bdd0695619fc686f207675fc311826f04"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.370346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.370433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.370453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.370487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.370507 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.473077 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.473149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.473167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.473194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.473212 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.539365 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-v4dm4"] Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.539993 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.540081 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.562890 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.572607 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ds2c\" (UniqueName: \"kubernetes.io/projected/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-kube-api-access-6ds2c\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.572824 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.576355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 
15:29:26.576389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.576398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.576428 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.576437 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.595528 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.615934 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.638506 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.660059 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.673870 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.673868 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.674034 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.674009888 +0000 UTC m=+49.972020821 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.674639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.674848 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:26 crc 
kubenswrapper[4848]: E1206 15:29:26.674909 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:27.174896755 +0000 UTC m=+34.472907678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.674850 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.675200 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.675268 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.675554 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.67540765 +0000 UTC m=+49.973418593 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.675554 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.675741 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds2c\" (UniqueName: \"kubernetes.io/projected/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-kube-api-access-6ds2c\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.675929 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.675869624 +0000 UTC m=+49.973880577 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.679855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.680062 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.680214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.680400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.680553 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.692098 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.707078 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc 
kubenswrapper[4848]: I1206 15:29:26.708684 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ds2c\" (UniqueName: \"kubernetes.io/projected/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-kube-api-access-6ds2c\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.729304 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548d
b1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.748001 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.761961 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.777195 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.777349 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777572 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777599 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777636 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777653 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777661 4848 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777679 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777800 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.777769391 +0000 UTC m=+50.075780344 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.777837 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.777822052 +0000 UTC m=+50.075833005 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.782855 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.784054 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.784121 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.784142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.784170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.784198 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.804320 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.828074 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.848119 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.868728 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:26Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.888050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.888116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.888139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.888171 
4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.888195 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.966405 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.966476 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.966555 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.966583 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.966645 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:26 crc kubenswrapper[4848]: E1206 15:29:26.967251 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.990825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.990875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.990891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.990912 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:26 crc kubenswrapper[4848]: I1206 15:29:26.990926 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:26Z","lastTransitionTime":"2025-12-06T15:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.093025 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.093054 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.093063 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.093075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.093084 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.181782 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:27 crc kubenswrapper[4848]: E1206 15:29:27.181900 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:27 crc kubenswrapper[4848]: E1206 15:29:27.182060 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:28.182044919 +0000 UTC m=+35.480055832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.195408 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.195447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.195459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.195485 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.195497 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.275504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" event={"ID":"7b907644-0c15-4494-a36e-b97960b3ab69","Type":"ContainerStarted","Data":"fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.275553 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" event={"ID":"7b907644-0c15-4494-a36e-b97960b3ab69","Type":"ContainerStarted","Data":"12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.293566 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.297944 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.297981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.297990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.298004 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.298013 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.310199 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.322813 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.334496 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.351639 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.367179 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:
10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27ab
f37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.381046 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.393825 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.400926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.401469 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.401593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.401773 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.401909 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.406452 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.420227 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd
5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.431428 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc 
kubenswrapper[4848]: I1206 15:29:27.444597 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.456551 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.466411 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.486295 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4aefc01383e69cea17cde42ded5c62fd6af83958c69c14fcb1e686102f3626e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"message\\\":\\\"for removal\\\\nI1206 15:29:21.891789 6123 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:21.891800 6123 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:21.891838 6123 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 15:29:21.891852 6123 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 15:29:21.891916 6123 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:21.891917 6123 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 15:29:21.891920 6123 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:21.891943 6123 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:21.891959 6123 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 15:29:21.891966 6123 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:21.892021 6123 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 15:29:21.892024 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:21.892028 6123 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:21.892377 6123 factory.go:656] Stopping watch factory\\\\nI1206 15:29:21.892414 6123 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.501583 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:27Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.503974 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.504004 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.504015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.504030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.504041 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.606255 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.606352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.606377 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.606404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.606426 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.709958 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.710027 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.710087 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.710114 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.710140 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.813014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.813057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.813069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.813084 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.813095 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.915798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.915834 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.915846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.915860 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.915871 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:27Z","lastTransitionTime":"2025-12-06T15:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:27 crc kubenswrapper[4848]: I1206 15:29:27.966332 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:27 crc kubenswrapper[4848]: E1206 15:29:27.966441 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.017486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.017753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.017829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.017904 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.017962 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.121263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.121344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.121363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.121393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.121413 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.191631 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:28 crc kubenswrapper[4848]: E1206 15:29:28.191959 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:28 crc kubenswrapper[4848]: E1206 15:29:28.192421 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:30.192390289 +0000 UTC m=+37.490401382 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.224798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.224843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.224854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.224870 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.224886 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.327450 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.327497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.327509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.327527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.327538 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.430092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.430332 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.430426 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.430495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.430558 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.534137 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.534426 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.534490 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.534555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.534609 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.639450 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.639512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.639537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.639572 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.639598 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.741670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.741726 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.741738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.741753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.741764 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.844350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.844783 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.844919 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.845071 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.845191 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.948985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.949069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.949091 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.949120 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.949142 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:28Z","lastTransitionTime":"2025-12-06T15:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.966502 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.966540 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:28 crc kubenswrapper[4848]: E1206 15:29:28.966687 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:28 crc kubenswrapper[4848]: I1206 15:29:28.966879 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:28 crc kubenswrapper[4848]: E1206 15:29:28.966977 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:28 crc kubenswrapper[4848]: E1206 15:29:28.967076 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.051936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.052005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.052031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.052075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.052102 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.154654 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.154936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.155023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.155154 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.155250 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.257806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.257866 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.257875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.257892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.257903 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.360412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.360460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.360474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.360491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.360503 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.463389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.463901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.464048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.464167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.464242 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.567475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.567522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.567537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.567555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.567569 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.671372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.671441 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.671461 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.671487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.671511 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.775630 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.775787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.775812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.775848 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.775872 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.879358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.880030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.880196 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.880388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.880602 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.966244 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:29 crc kubenswrapper[4848]: E1206 15:29:29.966397 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.984249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.984322 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.984340 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.984372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:29 crc kubenswrapper[4848]: I1206 15:29:29.984393 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:29Z","lastTransitionTime":"2025-12-06T15:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.088580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.088663 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.088688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.088758 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.088787 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.193233 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.193313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.193336 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.193368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.193390 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.217213 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:30 crc kubenswrapper[4848]: E1206 15:29:30.217438 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:30 crc kubenswrapper[4848]: E1206 15:29:30.217534 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:34.217506301 +0000 UTC m=+41.515517244 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.297026 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.297469 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.298343 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.298668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.298972 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.404003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.405141 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.405491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.405661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.405923 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.509175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.509251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.509274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.509311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.509340 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.614039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.614116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.614135 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.614169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.614193 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.717522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.717584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.717599 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.717621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.717640 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.821058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.821094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.821104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.821119 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.821127 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.924732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.924868 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.924888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.925838 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.925969 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:30Z","lastTransitionTime":"2025-12-06T15:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.966375 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.966552 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:30 crc kubenswrapper[4848]: E1206 15:29:30.966627 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:30 crc kubenswrapper[4848]: I1206 15:29:30.966788 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:30 crc kubenswrapper[4848]: E1206 15:29:30.966947 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:30 crc kubenswrapper[4848]: E1206 15:29:30.967111 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.031479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.031524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.031536 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.031558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.031577 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.135001 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.135065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.135081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.135103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.135119 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.236713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.236747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.236760 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.236774 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.236786 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.339820 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.339883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.339902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.339928 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.339949 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.442342 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.442382 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.442393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.442410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.442423 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.544941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.544973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.544981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.544993 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.545005 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.647056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.647100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.647110 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.647121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.647130 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.750133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.750248 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.750307 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.750350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.750422 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.770299 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.771067 4848 scope.go:117] "RemoveContainer" containerID="6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d" Dec 06 15:29:31 crc kubenswrapper[4848]: E1206 15:29:31.771264 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.785076 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.803370 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.815640 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.829988 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.843944 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.854593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.854832 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.854974 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc 
kubenswrapper[4848]: I1206 15:29:31.855066 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.855139 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.857392 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc 
kubenswrapper[4848]: I1206 15:29:31.872189 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.884800 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.897161 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.909101 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.920856 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.937992 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.951031 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.957960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.957989 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.957998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.958013 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.958023 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:31Z","lastTransitionTime":"2025-12-06T15:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.962029 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c2
67f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.965921 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:31 crc kubenswrapper[4848]: E1206 15:29:31.966046 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.972575 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc951
8e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:31 crc kubenswrapper[4848]: I1206 15:29:31.981376 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:31Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.060081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.060133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.060148 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.060169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.060184 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.162398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.162451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.162465 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.162484 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.162496 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.264333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.264400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.264412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.264431 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.264460 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.367438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.367475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.367486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.367500 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.367511 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.470327 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.470368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.470380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.470395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.470405 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.573149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.573181 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.573209 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.573223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.573234 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.675657 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.675697 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.675731 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.675748 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.675759 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.778146 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.778191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.778206 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.778224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.778239 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.881042 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.881099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.881116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.881142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.881157 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.965798 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:32 crc kubenswrapper[4848]: E1206 15:29:32.965926 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.965806 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.966026 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:32 crc kubenswrapper[4848]: E1206 15:29:32.966114 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:32 crc kubenswrapper[4848]: E1206 15:29:32.966282 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.985549 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:32Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.987693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.987769 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.987787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.987811 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.987828 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:32Z","lastTransitionTime":"2025-12-06T15:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:32 crc kubenswrapper[4848]: I1206 15:29:32.999515 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:32Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.012832 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.024286 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.040756 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.056718 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:
10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27ab
f37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.069451 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.084384 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.091304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.091375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.091396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.091425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.091447 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.096103 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.109107 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd
5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.123147 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc 
kubenswrapper[4848]: I1206 15:29:33.139542 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fa
b8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.154025 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.171689 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.189169 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.194324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.194368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.194385 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.194412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.194429 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.209823 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:33Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.296768 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.296836 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.296854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.296878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.296897 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.400016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.400072 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.400090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.400179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.400207 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.502885 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.502915 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.502924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.502936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.502945 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.605395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.605584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.605619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.605647 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.605668 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.708267 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.708352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.708377 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.708410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.708434 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.811849 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.811933 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.811952 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.811970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.811983 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.914605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.914667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.914679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.914731 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.914779 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:33Z","lastTransitionTime":"2025-12-06T15:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:33 crc kubenswrapper[4848]: I1206 15:29:33.966306 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:33 crc kubenswrapper[4848]: E1206 15:29:33.966516 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.017018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.017060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.017071 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.017087 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.017097 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.119432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.119476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.119491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.119511 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.119526 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.221949 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.222025 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.222051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.222065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.222074 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.255965 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.256152 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.256261 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:42.256234211 +0000 UTC m=+49.554245184 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.324348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.324396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.324411 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.324431 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.324505 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.427266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.427300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.427312 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.427327 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.427343 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.529662 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.529740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.529751 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.529770 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.529803 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.632363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.632412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.632424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.632441 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.632454 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.735321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.735361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.735372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.735388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.735400 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.837996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.838067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.838080 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.838098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.838109 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.922194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.922243 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.922254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.922271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.922287 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.934786 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:34Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.939927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.939978 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.939992 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.940009 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.940021 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.958620 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.958660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.958682 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.958720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.958738 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.965414 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.965471 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.965471 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.965552 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.965737 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.965878 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.978501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.978542 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.978554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.978571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.978584 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:34 crc kubenswrapper[4848]: E1206 15:29:34.989952 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:34Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.993781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.993830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.993844 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.993862 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:34 crc kubenswrapper[4848]: I1206 15:29:34.993877 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:34Z","lastTransitionTime":"2025-12-06T15:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: E1206 15:29:35.007068 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:35Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:35 crc kubenswrapper[4848]: E1206 15:29:35.007250 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.008773 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.008809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.008833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.008854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.008867 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.111509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.111565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.111578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.111593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.111602 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.213892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.213936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.213957 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.213975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.213986 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.316471 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.316582 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.316623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.316637 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.316646 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.419073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.419136 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.419152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.419172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.419191 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.521962 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.522018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.522027 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.522040 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.522048 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.625255 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.625290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.625303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.625319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.625331 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.728203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.728244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.728255 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.728273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.728285 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.830965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.831023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.831039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.831062 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.831078 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.933051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.933122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.933148 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.933193 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.933215 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:35Z","lastTransitionTime":"2025-12-06T15:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:35 crc kubenswrapper[4848]: I1206 15:29:35.965827 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:35 crc kubenswrapper[4848]: E1206 15:29:35.966249 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.035638 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.035723 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.035742 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.035764 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.035778 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.138326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.138380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.138398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.138421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.138440 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.241365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.241615 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.241679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.241785 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.241851 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.344841 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.345237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.345410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.345577 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.345827 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.448144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.448201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.448222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.448247 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.448264 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.551318 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.551361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.551372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.551390 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.551404 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.653684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.654370 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.654465 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.654559 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.654651 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.757072 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.757104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.757114 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.757127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.757141 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.859332 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.859375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.859386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.859402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.859413 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.962159 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.962217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.962227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.962249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.962260 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:36Z","lastTransitionTime":"2025-12-06T15:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.965815 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.965818 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:36 crc kubenswrapper[4848]: I1206 15:29:36.965922 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:36 crc kubenswrapper[4848]: E1206 15:29:36.966047 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:36 crc kubenswrapper[4848]: E1206 15:29:36.966144 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:36 crc kubenswrapper[4848]: E1206 15:29:36.966246 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.064874 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.064933 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.064947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.064967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.064979 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.168072 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.168153 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.168171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.168199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.168218 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.270949 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.271036 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.271061 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.271096 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.271124 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.374140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.374212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.374225 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.374249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.374264 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.476688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.476850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.476878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.476921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.476951 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.580344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.580399 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.580410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.580432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.580448 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.684108 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.684197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.684222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.684256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.684281 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.787518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.787569 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.787583 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.787602 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.787614 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.891135 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.891189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.891200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.891221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.891231 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.965875 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:37 crc kubenswrapper[4848]: E1206 15:29:37.966022 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.993987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.994028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.994039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.994059 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:37 crc kubenswrapper[4848]: I1206 15:29:37.994075 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:37Z","lastTransitionTime":"2025-12-06T15:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.097088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.097128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.097136 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.097150 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.097160 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.199809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.199880 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.199901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.199931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.199952 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.302979 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.303018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.303027 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.303039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.303048 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.406457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.406509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.406552 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.406571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.406586 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.508606 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.508641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.508650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.508663 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.508671 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.611236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.611293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.611305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.611326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.611338 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.714038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.714102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.714120 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.714143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.714160 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.816534 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.816592 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.816611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.816631 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.816643 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.919647 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.919769 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.919793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.919826 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.919846 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:38Z","lastTransitionTime":"2025-12-06T15:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.966172 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.966272 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:38 crc kubenswrapper[4848]: I1206 15:29:38.966346 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:38 crc kubenswrapper[4848]: E1206 15:29:38.966414 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:38 crc kubenswrapper[4848]: E1206 15:29:38.966509 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:38 crc kubenswrapper[4848]: E1206 15:29:38.966574 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.023065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.023158 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.023177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.023201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.023218 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.126430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.126512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.126533 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.126566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.126589 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.230193 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.230240 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.230250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.230266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.230278 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.332987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.333022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.333030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.333044 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.333054 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.435186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.435226 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.435236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.435251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.435260 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.537889 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.537930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.537939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.537956 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.537966 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.640591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.640641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.640650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.640667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.640677 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.744183 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.744242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.744253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.744268 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.744279 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.847584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.847684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.847757 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.847809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.847839 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.950601 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.950670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.950719 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.950749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.950772 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:39Z","lastTransitionTime":"2025-12-06T15:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:39 crc kubenswrapper[4848]: I1206 15:29:39.965860 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:39 crc kubenswrapper[4848]: E1206 15:29:39.965986 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.053617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.053665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.053674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.053707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.053738 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.156525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.156566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.156578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.156595 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.156606 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.259969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.260009 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.260026 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.260043 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.260056 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.363068 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.363160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.363197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.363225 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.363247 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.465802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.465846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.465858 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.465880 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.465893 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.569357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.569422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.569434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.569454 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.569465 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.671858 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.671902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.671910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.671925 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.671934 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.774768 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.774808 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.774816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.774829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.774838 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.877421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.877448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.877456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.877469 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.877478 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.965357 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.965373 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.965425 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:40 crc kubenswrapper[4848]: E1206 15:29:40.965468 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:40 crc kubenswrapper[4848]: E1206 15:29:40.965589 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:40 crc kubenswrapper[4848]: E1206 15:29:40.965638 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.979974 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.980031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.980048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.980071 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:40 crc kubenswrapper[4848]: I1206 15:29:40.980087 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:40Z","lastTransitionTime":"2025-12-06T15:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.082425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.082459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.082467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.082479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.082488 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.185059 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.185110 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.185122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.185143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.185155 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.287479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.287529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.287539 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.287556 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.287569 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.390490 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.390560 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.390581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.390608 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.390630 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.494010 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.494076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.494095 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.494119 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.494136 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.596543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.596646 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.596672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.596755 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.596783 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.699446 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.699521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.699544 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.699574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.699598 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.802439 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.802502 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.802520 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.802545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.802564 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.905363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.905423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.905439 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.905464 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.905482 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:41Z","lastTransitionTime":"2025-12-06T15:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:41 crc kubenswrapper[4848]: I1206 15:29:41.966355 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:41 crc kubenswrapper[4848]: E1206 15:29:41.966588 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.008792 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.008878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.008906 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.008935 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.008959 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.111852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.111894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.111905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.111920 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.111933 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.214252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.214287 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.214302 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.214317 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.214328 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.317388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.317433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.317442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.317457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.317466 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.345522 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.345673 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.345783 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:29:58.345764122 +0000 UTC m=+65.643775035 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.420092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.420134 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.420147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.420161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.420200 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.522289 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.522331 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.522343 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.522358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.522370 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.624483 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.624521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.624530 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.624543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.624552 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.726941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.726993 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.727002 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.727017 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.727027 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.748431 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.748545 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.748613 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:14.748590427 +0000 UTC m=+82.046601340 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.748624 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.748682 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:14.748666539 +0000 UTC m=+82.046677472 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.748754 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.748897 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.748969 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:14.748957808 +0000 UTC m=+82.046968731 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.829474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.829539 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.829561 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.830010 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.830052 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.850015 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.850058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850176 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850191 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850201 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850246 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:14.850234336 +0000 UTC m=+82.148245249 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850290 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850329 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850343 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.850399 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:14.850381521 +0000 UTC m=+82.148392434 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.932909 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.932967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.932980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.933001 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.933014 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:42Z","lastTransitionTime":"2025-12-06T15:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.965681 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.965755 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.965857 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.965900 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.966032 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:42 crc kubenswrapper[4848]: E1206 15:29:42.966083 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.978686 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:42Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:42 crc kubenswrapper[4848]: I1206 15:29:42.991213 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:42Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.001782 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.011168 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.020541 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.031883 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc 
kubenswrapper[4848]: I1206 15:29:43.035078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.035239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.035347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.035455 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.035540 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.045998 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.057868 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.105056 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.113769 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.129316 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.137996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.138024 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.138035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.138051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.138063 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.140241 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.151619 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.162941 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.172960 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.184051 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:43Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.239977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc 
kubenswrapper[4848]: I1206 15:29:43.240246 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.240423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.240513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.240609 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.343239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.343271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.343281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.343295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.343304 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.445621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.445648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.445658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.445670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.445679 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.548137 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.548165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.548173 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.548187 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.548197 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.651273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.651321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.651330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.651348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.651358 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.754253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.754328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.754339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.754355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.754364 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.857414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.857473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.857490 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.857513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.857530 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.960770 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.960833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.960857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.960885 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.960906 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:43Z","lastTransitionTime":"2025-12-06T15:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:43 crc kubenswrapper[4848]: I1206 15:29:43.966071 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:43 crc kubenswrapper[4848]: E1206 15:29:43.966293 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.063423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.063475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.063495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.063520 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.063538 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.166975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.167044 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.167057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.167082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.167104 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.269924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.270001 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.270030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.270058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.270083 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.335569 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.352167 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.353391 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.374007 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.374068 4848 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.373966 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96
lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.374088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.374222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.374237 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.395312 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.411585 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.428141 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.445910 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.467614 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.476747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.476799 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.476816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.476834 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.476847 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.482646 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.494646 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.507753 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.519415 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.530627 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.541576 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.554299 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.565369 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc 
kubenswrapper[4848]: I1206 15:29:44.579668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.579756 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.579768 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.579785 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.579797 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.584895 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:44Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.683546 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.683589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.683600 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.683619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.683631 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.786836 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.786884 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.786898 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.786916 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.786928 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.889363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.890054 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.890125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.890167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.890195 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.965846 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.965853 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.966120 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 15:29:44 crc kubenswrapper[4848]: E1206 15:29:44.966220 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 15:29:44 crc kubenswrapper[4848]: E1206 15:29:44.966280 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 15:29:44 crc kubenswrapper[4848]: E1206 15:29:44.966382 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.966501 4848 scope.go:117] "RemoveContainer" containerID="6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.993809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.993844 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.993854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.993869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:44 crc kubenswrapper[4848]: I1206 15:29:44.993878 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:44Z","lastTransitionTime":"2025-12-06T15:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.097483 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.097882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.097899 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.097922 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.097939 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.200388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.200428 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.200437 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.200451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.200461 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.204815 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.204858 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.204874 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.204897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.204914 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.223159 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.227015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.227049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.227060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.227076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.227089 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.238673 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.242633 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.242662 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.242673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.242689 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.242744 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.253932 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.258070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.258105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.258116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.258133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.258145 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.269282 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.272449 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.272481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.272493 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.272519 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.272531 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.285965 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.286085 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.302816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.302846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.302855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.302870 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.302880 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.329765 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/1.log" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.332553 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.333027 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.355406 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 
15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.367357 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.429036 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.429081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.429092 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.429116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.429127 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.439800 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.453910 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.471713 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.487714 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc 
kubenswrapper[4848]: I1206 15:29:45.503319 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fa
b8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.514860 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.526450 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.530946 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.530985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.530994 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.531008 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.531019 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.542628 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.553690 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.573252 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.585274 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.600258 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.611175 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.620751 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.632900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.632929 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.632938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc 
kubenswrapper[4848]: I1206 15:29:45.632950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.632959 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.634300 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:45Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.735564 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.735611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.735621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.735637 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.735646 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.838021 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.838068 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.838080 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.838098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.838110 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.940742 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.940781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.940790 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.940804 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.940815 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:45Z","lastTransitionTime":"2025-12-06T15:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:45 crc kubenswrapper[4848]: I1206 15:29:45.965670 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:45 crc kubenswrapper[4848]: E1206 15:29:45.965867 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.043022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.043056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.043064 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.043078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.043089 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.145633 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.145673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.145683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.145713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.145722 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.247796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.247843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.247855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.247869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.247880 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.336482 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/2.log" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.337108 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/1.log" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.339522 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" exitCode=1 Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.339566 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.339603 4848 scope.go:117] "RemoveContainer" containerID="6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.340199 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:29:46 crc kubenswrapper[4848]: E1206 15:29:46.340353 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.351278 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.351314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.351339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.351359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.351373 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.352643 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.362145 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc 
kubenswrapper[4848]: I1206 15:29:46.373966 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.384786 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.397263 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.405503 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.413863 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.428251 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be118b6cf74094bc320fbce2638beb1d65ce8584b5134cf4817d82aeafd3e2d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:23Z\\\",\\\"message\\\":\\\"alse, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", 
UUID:\\\\\\\"be9dcc9e-c16a-4962-a6d2-4adeb0b929c4\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string{}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_TCP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Swi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd
0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.439200 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.448517 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.453334 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.453371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.453379 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.453395 
4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.453408 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.457610 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-reso
urces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.467999 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.478878 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.491399 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.501313 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.510952 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.523525 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:46Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.555381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.555415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.555424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 
15:29:46.555436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.555445 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.657930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.657968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.657977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.657990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.657999 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.760860 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.760932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.760956 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.761015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.761038 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.863591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.863688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.863769 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.863800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.863821 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.965978 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.966001 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.966001 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:46 crc kubenswrapper[4848]: E1206 15:29:46.966121 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:46 crc kubenswrapper[4848]: E1206 15:29:46.966217 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:46 crc kubenswrapper[4848]: E1206 15:29:46.966324 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.967444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.967469 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.967477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.967489 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:46 crc kubenswrapper[4848]: I1206 15:29:46.967498 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:46Z","lastTransitionTime":"2025-12-06T15:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.069820 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.069871 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.069883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.069898 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.069911 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.172458 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.172501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.172509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.172524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.172533 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.275581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.275621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.275632 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.275651 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.275664 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.346730 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/2.log" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.353191 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:29:47 crc kubenswrapper[4848]: E1206 15:29:47.353324 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.373688 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.379265 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.379346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.379371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc 
kubenswrapper[4848]: I1206 15:29:47.379405 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.379424 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.396090 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.413397 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.429435 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.448755 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.466668 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.482588 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.482617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.482626 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.482643 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.482656 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.491951 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.510787 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd
5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b64734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.525450 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc 
kubenswrapper[4848]: I1206 15:29:47.542611 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.554016 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.568614 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.577575 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.584547 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.584598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.584609 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.584627 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.584717 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.597139 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 
15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.613572 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.625633 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.637789 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:47Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.686527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.686565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.686574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.686588 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.686598 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.789077 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.789125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.789135 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.789149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.789159 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.892352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.892402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.892423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.892444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.892457 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.966179 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:47 crc kubenswrapper[4848]: E1206 15:29:47.966285 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.995056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.995099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.995108 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.995121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:47 crc kubenswrapper[4848]: I1206 15:29:47.995129 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:47Z","lastTransitionTime":"2025-12-06T15:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.097033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.097115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.097128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.097147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.097160 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.199337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.199373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.199384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.199400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.199409 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.301889 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.301929 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.301941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.301958 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.301970 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.403859 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.403890 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.403900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.403916 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.403927 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.505867 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.506194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.506379 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.506545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.506824 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.610425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.610462 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.610475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.610491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.610502 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.712380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.712412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.712421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.712435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.712445 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.814891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.814950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.814970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.814993 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.815011 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.918150 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.918437 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.918517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.918604 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.918771 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:48Z","lastTransitionTime":"2025-12-06T15:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.966391 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:48 crc kubenswrapper[4848]: E1206 15:29:48.966654 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.966937 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:48 crc kubenswrapper[4848]: I1206 15:29:48.967084 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:48 crc kubenswrapper[4848]: E1206 15:29:48.967194 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:48 crc kubenswrapper[4848]: E1206 15:29:48.967241 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.022152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.022362 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.022518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.022661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.022878 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.126948 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.127018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.127038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.127067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.127087 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.230826 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.231335 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.231522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.231660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.231824 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.334675 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.335117 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.335341 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.335528 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.335677 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.438998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.439033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.439045 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.439060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.439074 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.542629 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.542800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.542939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.542970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.542990 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.646458 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.646531 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.646550 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.646578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.646600 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.748897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.748945 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.748955 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.748973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.748984 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.852391 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.852447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.852463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.852481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.852493 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.955315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.955387 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.955405 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.955432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.955451 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:49Z","lastTransitionTime":"2025-12-06T15:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:49 crc kubenswrapper[4848]: I1206 15:29:49.965829 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:49 crc kubenswrapper[4848]: E1206 15:29:49.965968 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.057931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.057973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.057985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.058000 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.058011 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.160415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.160673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.160765 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.160866 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.160941 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.263529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.263567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.263577 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.263592 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.263602 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.364835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.364887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.364902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.364921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.364934 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.466829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.467125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.467304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.467491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.467740 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.570619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.570721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.570735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.570757 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.570770 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.673323 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.673349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.673358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.673371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.673380 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.776009 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.776242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.776308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.776415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.776485 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.879186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.879243 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.879264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.879293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.879343 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.966103 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.966216 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:50 crc kubenswrapper[4848]: E1206 15:29:50.966223 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:50 crc kubenswrapper[4848]: E1206 15:29:50.966295 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.966103 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:50 crc kubenswrapper[4848]: E1206 15:29:50.966503 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.981567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.981793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.981880 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.981971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:50 crc kubenswrapper[4848]: I1206 15:29:50.982063 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:50Z","lastTransitionTime":"2025-12-06T15:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.084424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.084457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.084466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.084477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.084486 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.186980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.187027 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.187044 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.187061 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.187073 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.289714 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.289753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.289764 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.289780 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.289790 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.392753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.392813 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.392830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.392857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.392891 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.495807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.495872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.495894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.495922 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.495944 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.599207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.599258 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.599273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.599295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.599312 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.702224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.702292 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.702315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.702346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.702367 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.805086 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.805146 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.805163 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.805185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.805201 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.908811 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.908873 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.908894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.908923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.908948 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:51Z","lastTransitionTime":"2025-12-06T15:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:51 crc kubenswrapper[4848]: I1206 15:29:51.965852 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:51 crc kubenswrapper[4848]: E1206 15:29:51.966043 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.011827 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.011875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.011890 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.011910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.011926 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.114500 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.114567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.114591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.114617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.114637 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.217678 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.217733 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.217745 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.217761 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.217771 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.319972 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.320060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.320076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.320099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.320110 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.422819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.422857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.422867 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.422882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.422893 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.525133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.525169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.525179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.525195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.525205 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.627554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.627586 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.627594 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.627607 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.627616 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.730016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.730082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.730100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.730124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.730141 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.833244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.833281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.833290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.833304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.833313 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.935276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.935319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.935359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.935377 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.935387 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:52Z","lastTransitionTime":"2025-12-06T15:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.965776 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.965817 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.965776 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:52 crc kubenswrapper[4848]: E1206 15:29:52.965926 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:52 crc kubenswrapper[4848]: E1206 15:29:52.966022 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:52 crc kubenswrapper[4848]: E1206 15:29:52.967303 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.983592 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:52Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:52 crc kubenswrapper[4848]: I1206 15:29:52.995547 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:52Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.005904 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc 
kubenswrapper[4848]: I1206 15:29:53.022189 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.035176 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.038615 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.038644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.038652 
4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.038665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.038674 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.048725 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f
55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.064320 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.075453 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.091916 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 
15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.105429 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.116141 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.126537 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.137820 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.143349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.143398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.143410 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 
15:29:53.143425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.143437 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.147789 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.159804 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.171810 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.181795 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:53Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.245806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.246019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.246122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc 
kubenswrapper[4848]: I1206 15:29:53.246189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.246326 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.348897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.349152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.349186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.349201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.349209 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.451212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.451418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.451485 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.451599 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.451686 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.554322 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.554608 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.554718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.554801 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.554894 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.657584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.657617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.657624 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.657637 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.657646 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.760869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.761168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.761300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.761447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.761534 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.865791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.865891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.865916 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.865948 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.865971 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.966286 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:53 crc kubenswrapper[4848]: E1206 15:29:53.966485 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.969050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.969203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.969229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.969253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:53 crc kubenswrapper[4848]: I1206 15:29:53.969279 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:53Z","lastTransitionTime":"2025-12-06T15:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.071090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.071124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.071132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.071144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.071153 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.172732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.172777 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.172791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.172808 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.172839 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.275649 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.275777 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.275802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.275832 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.275852 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.379463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.379556 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.379575 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.379599 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.379616 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.483573 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.483640 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.483657 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.483679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.483721 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.585887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.586271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.586415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.586570 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.586680 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.689042 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.689092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.689105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.689122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.689136 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.792167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.792479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.792672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.792928 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.793125 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.896567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.896759 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.896852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.896943 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.897044 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.965886 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:54 crc kubenswrapper[4848]: E1206 15:29:54.966055 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.966502 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:54 crc kubenswrapper[4848]: E1206 15:29:54.966764 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.969360 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:54 crc kubenswrapper[4848]: E1206 15:29:54.969616 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.998987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.999026 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.999036 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.999052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:54 crc kubenswrapper[4848]: I1206 15:29:54.999061 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:54Z","lastTransitionTime":"2025-12-06T15:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.101307 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.101361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.101373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.101389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.101399 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.203593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.203647 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.203656 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.203670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.203678 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.306638 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.306674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.306686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.306717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.306730 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.408819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.409052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.409133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.409212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.409280 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.511506 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.511792 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.511872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.511959 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.512045 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.538819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.538879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.538893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.538911 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.538923 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.551987 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:55Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.555378 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.555507 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.555598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.555680 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.555822 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.575145 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:55Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.580324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.580375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.580389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.580407 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.580425 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.592856 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:55Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.596145 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.596187 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.596199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.596216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.596229 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.610061 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:55Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.613266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.613304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.613314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.613328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.613338 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.623890 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:29:55Z is after 2025-08-24T17:21:41Z" Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.624041 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.625505 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.625605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.625618 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.625636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.625648 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.728063 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.728115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.728127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.728140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.728150 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.830676 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.830978 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.831049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.831117 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.831187 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.933437 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.933473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.933482 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.933496 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.933504 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:55Z","lastTransitionTime":"2025-12-06T15:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:55 crc kubenswrapper[4848]: I1206 15:29:55.966559 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:55 crc kubenswrapper[4848]: E1206 15:29:55.966683 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.035648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.035676 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.035684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.035709 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.035718 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.138058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.138095 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.138103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.138118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.138128 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.239984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.240035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.240049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.240073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.240089 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.343324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.343363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.343375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.343392 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.343403 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.445481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.445533 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.445545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.445565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.445588 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.547788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.547841 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.547853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.547869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.547881 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.650217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.650256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.650268 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.650289 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.650301 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.752604 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.752644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.752653 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.752667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.752675 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.855385 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.855429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.855440 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.855459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.855469 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.958101 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.958149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.958158 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.958175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.958187 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:56Z","lastTransitionTime":"2025-12-06T15:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.965643 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.965662 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:56 crc kubenswrapper[4848]: I1206 15:29:56.965686 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:56 crc kubenswrapper[4848]: E1206 15:29:56.965769 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:56 crc kubenswrapper[4848]: E1206 15:29:56.965835 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:56 crc kubenswrapper[4848]: E1206 15:29:56.965905 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.060955 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.061022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.061045 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.061073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.061104 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.163512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.163569 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.163581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.163596 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.163606 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.266789 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.266842 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.266853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.266870 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.266883 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.369353 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.369394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.369403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.369420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.369429 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.472143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.472180 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.472191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.472205 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.472215 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.574350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.574415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.574438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.574467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.574489 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.677859 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.677896 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.677907 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.677923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.677934 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.780413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.780449 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.780461 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.780474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.780485 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.883607 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.883650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.883661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.883678 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.883709 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.966087 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:57 crc kubenswrapper[4848]: E1206 15:29:57.966215 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.985884 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.985961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.985985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.986014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:57 crc kubenswrapper[4848]: I1206 15:29:57.986033 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:57Z","lastTransitionTime":"2025-12-06T15:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.087780 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.087821 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.087831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.087845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.087858 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.189958 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.190003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.190014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.190029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.190040 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.292581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.292643 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.292659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.292679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.292691 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.394743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.394787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.394800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.394816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.394828 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.412212 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:58 crc kubenswrapper[4848]: E1206 15:29:58.412373 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:58 crc kubenswrapper[4848]: E1206 15:29:58.412440 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs podName:0f6acd83-a70e-4a34-96a5-ea7bd9e95935 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:30.412422422 +0000 UTC m=+97.710433335 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs") pod "network-metrics-daemon-v4dm4" (UID: "0f6acd83-a70e-4a34-96a5-ea7bd9e95935") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.496902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.496939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.496949 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.497004 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.497016 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.598932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.599017 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.599032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.599048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.599059 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.702008 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.702061 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.702077 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.702097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.702111 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.804446 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.804526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.804548 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.804580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.804599 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.907235 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.907276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.907284 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.907298 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.907307 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:58Z","lastTransitionTime":"2025-12-06T15:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.966152 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.966243 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:29:58 crc kubenswrapper[4848]: I1206 15:29:58.966252 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:29:58 crc kubenswrapper[4848]: E1206 15:29:58.966340 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:29:58 crc kubenswrapper[4848]: E1206 15:29:58.966437 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:29:58 crc kubenswrapper[4848]: E1206 15:29:58.966519 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.009854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.009892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.009900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.009914 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.009923 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.112195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.112239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.112250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.112266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.112277 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.214470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.214508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.214521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.214535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.214546 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.316837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.316876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.316886 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.316900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.316909 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.418986 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.419019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.419028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.419041 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.419051 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.521306 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.521350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.521358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.521372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.521385 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.623455 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.623495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.623504 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.623519 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.623531 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.725767 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.725815 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.725823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.725838 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.725848 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.828521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.828600 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.828610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.828627 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.828636 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.931254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.931286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.931294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.931305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.931314 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:29:59Z","lastTransitionTime":"2025-12-06T15:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:29:59 crc kubenswrapper[4848]: I1206 15:29:59.965947 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:29:59 crc kubenswrapper[4848]: E1206 15:29:59.966054 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.036332 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.036372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.036384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.036400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.036412 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.139094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.139151 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.139161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.139179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.139191 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.241442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.241477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.241487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.241500 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.241509 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.344195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.344316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.344344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.344375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.344396 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.393148 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qx6m8_9c409d16-f97d-4bcd-bf25-b80af1b16922/kube-multus/0.log" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.393205 4848 generic.go:334] "Generic (PLEG): container finished" podID="9c409d16-f97d-4bcd-bf25-b80af1b16922" containerID="56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541" exitCode=1 Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.393238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerDied","Data":"56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.393609 4848 scope.go:117] "RemoveContainer" containerID="56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.406744 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.418008 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.427606 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.447836 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 
15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.448172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.448212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.448224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.448243 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.448253 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.461882 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.473904 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.488888 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.500203 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.511002 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.522518 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.537595 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:59Z\\\",\\\"message\\\":\\\"2025-12-06T15:29:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0\\\\n2025-12-06T15:29:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0 to /host/opt/cni/bin/\\\\n2025-12-06T15:29:14Z [verbose] multus-daemon started\\\\n2025-12-06T15:29:14Z [verbose] Readiness Indicator file check\\\\n2025-12-06T15:29:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.551199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.551251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.551263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.551278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.551291 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.554770 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.572333 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.586182 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.602462 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.617677 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc 
kubenswrapper[4848]: I1206 15:30:00.633771 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:00Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.653584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.653612 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.653623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.653636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.653645 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.756276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.756329 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.756339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.756355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.756364 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.858574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.858619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.858631 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.858653 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.858683 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.960593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.960882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.960963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.961038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.961107 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:00Z","lastTransitionTime":"2025-12-06T15:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.966151 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:00 crc kubenswrapper[4848]: E1206 15:30:00.966390 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.966199 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:00 crc kubenswrapper[4848]: I1206 15:30:00.966186 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:00 crc kubenswrapper[4848]: E1206 15:30:00.967101 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:00 crc kubenswrapper[4848]: E1206 15:30:00.967282 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.064109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.064163 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.064176 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.064192 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.064206 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.167241 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.167525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.167628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.167774 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.168048 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.270810 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.270863 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.270876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.270896 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.270910 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.373067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.373104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.373112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.373125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.373170 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.397348 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qx6m8_9c409d16-f97d-4bcd-bf25-b80af1b16922/kube-multus/0.log" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.397651 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerStarted","Data":"04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.408788 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.419043 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.430308 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.440055 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.458494 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 
15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.473270 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.475339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.475375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.475386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.475402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.475414 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.484797 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.494617 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.505360 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.514834 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.524521 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:59Z\\\",\\\"message\\\":\\\"2025-12-06T15:29:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0\\\\n2025-12-06T15:29:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0 to /host/opt/cni/bin/\\\\n2025-12-06T15:29:14Z [verbose] multus-daemon started\\\\n2025-12-06T15:29:14Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T15:29:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:30:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.536550 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.547664 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.557003 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.565517 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.575745 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.577218 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.577244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.577260 4848 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.577273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.577281 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.585955 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:01Z is after 2025-08-24T17:21:41Z" Dec 
06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.679790 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.679826 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.679834 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.679850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.679861 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.781998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.782034 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.782043 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.782057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.782066 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.884207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.884261 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.884273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.884292 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.884304 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.966128 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:01 crc kubenswrapper[4848]: E1206 15:30:01.966613 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.966864 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:30:01 crc kubenswrapper[4848]: E1206 15:30:01.967061 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.986018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.986044 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.986052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.986064 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:01 crc kubenswrapper[4848]: I1206 15:30:01.986073 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:01Z","lastTransitionTime":"2025-12-06T15:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.088560 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.088600 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.088614 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.088635 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.088647 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.190315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.190351 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.190362 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.190375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.190388 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.292973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.293028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.293038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.293055 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.293065 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.395309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.395338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.395346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.395362 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.395371 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.500674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.500762 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.500797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.500819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.500831 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.602713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.602750 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.602759 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.602772 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.602780 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.704795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.705057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.705142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.705208 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.705305 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.808023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.808619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.809029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.809418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.809793 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.912609 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.912646 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.912654 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.912668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.912676 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:02Z","lastTransitionTime":"2025-12-06T15:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.965585 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:02 crc kubenswrapper[4848]: E1206 15:30:02.965752 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.965842 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:02 crc kubenswrapper[4848]: E1206 15:30:02.965899 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.966034 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:02 crc kubenswrapper[4848]: E1206 15:30:02.966096 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:02 crc kubenswrapper[4848]: I1206 15:30:02.983544 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:02Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.000812 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:59Z\\\",\\\"message\\\":\\\"2025-12-06T15:29:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0\\\\n2025-12-06T15:29:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0 to /host/opt/cni/bin/\\\\n2025-12-06T15:29:14Z [verbose] multus-daemon started\\\\n2025-12-06T15:29:14Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T15:29:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:30:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:02Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015193 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015245 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015258 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015289 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.015967 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.030320 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.044299 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.057355 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b907644-0c15-4494-a36e-b97960b3ab69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12066fadd5408753f9318ebc27ac9f1c5b32bc3edd4e505d18aeb54ad29c6123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9f07ce709d63c063c1735253583d56c00b6
4734c0886e5e4ac429b2a89dc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7t5bc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-96ljk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.070246 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ds2c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v4dm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc 
kubenswrapper[4848]: I1206 15:30:03.085167 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce3182287
07465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.095961 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.106476 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075a8b28a3d924b88494a0f0384fe9127359d12e5567473525fd25a0d9fd873e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.115622 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qk7cq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71fffe2-9e92-49ff-9a8e-6b08e2946b23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb5cb7f3b0a29e17833f33e3b6c694e50fdeb801d6d0c590e94c40fed3be2c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pmv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qk7cq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.117267 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.117312 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.117321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.117334 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.117345 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.124265 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b5fcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb75de0-7ecd-4b1a-8322-51af09d62176\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1b44da55debf32cd4dc896acaee2f2face751cf9bf0c1c73d6f4c775a2b622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96lgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b5fcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.144719 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:46Z\\\",\\\"message\\\":\\\"o/client-go/informers/factory.go:160\\\\nI1206 15:29:45.860578 6549 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 15:29:45.861156 6549 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 
15:29:45.861202 6549 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 15:29:45.861227 6549 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 15:29:45.861230 6549 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 15:29:45.861268 6549 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 15:29:45.861341 6549 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 15:29:45.861358 6549 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 15:29:45.861368 6549 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 15:29:45.861420 6549 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 15:29:45.861455 6549 factory.go:656] Stopping watch factory\\\\nI1206 15:29:45.861496 6549 ovnkube.go:599] Stopped ovnkube\\\\nI1206 15:29:45.861495 6549 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 15:29:45.861546 6549 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8g4jc_openshift-ovn-kubernetes(9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3170c61b0bc7742d31
c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zznnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8g4jc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.157411 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6dad9b9-172f-494c-adb1-da5c45b89ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e60f4265cafd1d67f5d36b4c89e8de2cbea0e77aa59cc7037dc2a4a62ed5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b0d036eab5f622896233d90a5ad366b17479415bd06f4cac14cf200cae2cbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://510312ccae5f4d76904a8e60da0fab8669b2ba2f0e418624b6d64feb0c9acfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5ccd0da3ba4b63bc46cdc4bf1f4fa27d017babc1e976a7e1af99b91e2ba591b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e69c
916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e69c916f80f7ebad6382a77291b02f352885cf3169526205defb672b1565e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc72ac1c8e6b5f469041e79e8b02d0471792a1b7be19260c4ae328e00bd827d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19f27105c66727a4500a806d0d152a18b10a57d91040b8cb5dcc56f20e1ea4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:29:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k855v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zmmx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.169409 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f70168e-bafd-4e55-b148-ba3a24d9ab2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139e0cbf7368d61c030fcb69d2d1ec6033ba46071a1bc3db36da368ade318aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://209eda56da117443177af5bc5a3e08ef2b8b576c746f8d1ad5d5c5431e15c944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc660dd9ac858cbeedc67d05ddca69a329ff039cfb65bd44f4b3727bb368481\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.179358 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6758c7d6-fd0a-486e-914e-65ac5c4cab5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e8b0e830434d7b7b2f12ca82ef784eb9309d331ff9ec8459c08909632c076ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be90f4cb06fb6e6d8fcfe736e1b874905e8b6e8e482120145cfbea106777873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6a5b4eb903385cb761b52b1e4babf4024acf274822be64616d80e66a6c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://522f7ac7d144ffe3f3f0291f09fe45046b37cd0909f905198053f0400f19391a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.189222 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a1a849a492de247fbff4cd6f6e09c422b791a8ba1376abf80eb3ec2762947e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9385a6fc9518e4260c8afe6f0495b1655e3af85d2cdc76d9db6d89ac6193953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:03Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.219476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.219514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.219525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.219538 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.219548 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.321299 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.321338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.321349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.321364 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.321375 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.423432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.424007 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.424078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.424163 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.424227 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.526448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.526497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.526509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.526525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.526535 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.629375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.629648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.629781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.629876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.629959 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.733324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.733596 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.733677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.733812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.733932 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.836538 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.836841 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.836945 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.837037 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.837124 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.939256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.939309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.939321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.939338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.939351 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:03Z","lastTransitionTime":"2025-12-06T15:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:03 crc kubenswrapper[4848]: I1206 15:30:03.965970 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:03 crc kubenswrapper[4848]: E1206 15:30:03.966115 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.041133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.041202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.041214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.041228 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.041238 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.143846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.143884 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.143894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.143909 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.143918 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.245924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.245952 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.245960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.245973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.245984 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.348162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.348199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.348207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.348219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.348230 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.450030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.450078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.450089 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.450105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.450117 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.552459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.552503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.552511 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.552525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.552535 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.654508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.654564 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.654587 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.654609 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.654623 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.757115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.757184 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.757200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.757221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.757234 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.859273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.859319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.859332 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.859352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.859362 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.962076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.962127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.962138 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.962156 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.962168 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:04Z","lastTransitionTime":"2025-12-06T15:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.965395 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.965419 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:04 crc kubenswrapper[4848]: E1206 15:30:04.965488 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:04 crc kubenswrapper[4848]: I1206 15:30:04.965607 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:04 crc kubenswrapper[4848]: E1206 15:30:04.965647 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:04 crc kubenswrapper[4848]: E1206 15:30:04.965796 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.063777 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.063811 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.063819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.063831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.063841 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.166477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.166540 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.166564 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.166587 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.166604 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.269109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.269149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.269158 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.269189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.269202 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.371256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.371306 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.371316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.371330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.371341 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.473412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.473450 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.473463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.473479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.473542 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.576056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.576095 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.576104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.576116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.576126 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.678128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.678221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.678294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.678324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.678341 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.780414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.780458 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.780466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.780480 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.780489 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.880517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.880563 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.880575 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.880591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.880602 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.894967 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:05Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.898892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.898994 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.899016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.899030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.899058 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.911824 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:05Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.915270 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.915300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.915311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.915327 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.915338 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.929434 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:05Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.933778 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.933822 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.933834 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.933851 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.933863 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.947421 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:05Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.951765 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.951801 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.951812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.951827 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.951839 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.962680 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160f41fb-e7dc-4cf7-b6e3-c75b9c86b1c5\\\",\\\"systemUUID\\\":\\\"fdce5a22-c98f-4909-8c21-e3a12013664f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:05Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.962861 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.964498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.964527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.964537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.964550 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.964561 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:05Z","lastTransitionTime":"2025-12-06T15:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:05 crc kubenswrapper[4848]: I1206 15:30:05.965830 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:05 crc kubenswrapper[4848]: E1206 15:30:05.965948 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.067452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.067488 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.067499 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.067517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.067530 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.180383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.180431 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.180442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.180460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.180469 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.282894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.282972 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.282997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.283031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.283055 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.386837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.386896 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.386915 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.386938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.386956 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.490984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.491067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.491093 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.491123 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.491148 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.594116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.594186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.594197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.594214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.594225 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.696635 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.696963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.696997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.697011 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.697019 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.799912 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.799983 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.800003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.800019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.800030 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.902625 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.902670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.902684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.902717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.902761 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:06Z","lastTransitionTime":"2025-12-06T15:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.966186 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.966207 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:06 crc kubenswrapper[4848]: E1206 15:30:06.966305 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:06 crc kubenswrapper[4848]: E1206 15:30:06.966518 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:06 crc kubenswrapper[4848]: I1206 15:30:06.966546 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:06 crc kubenswrapper[4848]: E1206 15:30:06.966672 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.005090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.005137 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.005150 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.005165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.005175 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.108244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.108288 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.108304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.108328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.108346 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.212017 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.212133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.212164 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.212207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.212251 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.315420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.315537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.315560 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.315592 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.315615 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.418741 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.418809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.418830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.418859 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.418881 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.521809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.521877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.521893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.521913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.521935 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.624472 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.624532 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.624543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.624562 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.624574 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.727300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.727395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.727423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.727457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.727480 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.829918 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.829966 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.829982 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.829999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.830011 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.933193 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.933262 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.933285 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.933314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.933335 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:07Z","lastTransitionTime":"2025-12-06T15:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:07 crc kubenswrapper[4848]: I1206 15:30:07.965528 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:07 crc kubenswrapper[4848]: E1206 15:30:07.965675 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.036636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.036683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.036707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.036723 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.036734 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.138386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.138467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.138489 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.138521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.138543 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.241488 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.241527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.241537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.241551 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.241561 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.343436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.343476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.343487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.343504 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.343515 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.446081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.446112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.446120 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.446133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.446141 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.548039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.548081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.548090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.548105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.548116 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.650797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.650859 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.650877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.650903 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.650920 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.754398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.754518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.754543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.754573 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.754597 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.857831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.857900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.857921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.857957 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.857996 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.961312 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.961368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.961383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.961402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.961417 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:08Z","lastTransitionTime":"2025-12-06T15:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.965969 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.966009 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:08 crc kubenswrapper[4848]: I1206 15:30:08.966070 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:08 crc kubenswrapper[4848]: E1206 15:30:08.966159 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:08 crc kubenswrapper[4848]: E1206 15:30:08.966305 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:08 crc kubenswrapper[4848]: E1206 15:30:08.966339 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.065050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.065129 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.065151 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.065174 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.065191 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.168221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.168274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.168291 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.168314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.168330 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.271371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.271434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.271453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.271477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.271494 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.373574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.373649 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.373681 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.373743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.373761 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.476435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.476516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.476549 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.476585 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.476609 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.579250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.579309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.579328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.579356 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.579374 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.681452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.681501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.681512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.681525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.681536 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.783301 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.783357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.783375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.783395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.783410 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.885977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.886019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.886042 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.886068 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.886086 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.966111 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:09 crc kubenswrapper[4848]: E1206 15:30:09.966237 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.988718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.988749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.988756 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.988769 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:09 crc kubenswrapper[4848]: I1206 15:30:09.988780 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:09Z","lastTransitionTime":"2025-12-06T15:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.091554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.091595 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.091604 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.091618 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.091793 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.194009 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.194055 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.194067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.194083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.194097 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.296083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.296121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.296132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.296149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.296161 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.398953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.399012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.399031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.399054 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.399071 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.502514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.502581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.502598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.502625 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.502643 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.605219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.605263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.605278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.605298 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.605311 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.708011 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.708079 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.708096 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.708121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.708140 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.810644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.810752 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.810780 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.810811 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.810828 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.913254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.913312 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.913330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.913355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.913371 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:10Z","lastTransitionTime":"2025-12-06T15:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.966250 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.966348 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:10 crc kubenswrapper[4848]: E1206 15:30:10.966408 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:10 crc kubenswrapper[4848]: I1206 15:30:10.966454 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:10 crc kubenswrapper[4848]: E1206 15:30:10.966513 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:10 crc kubenswrapper[4848]: E1206 15:30:10.966755 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.016690 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.016797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.016810 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.016828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.017146 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.120514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.120558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.120568 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.120610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.120622 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.224303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.224413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.224430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.224450 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.224463 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.327688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.327758 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.327772 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.327791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.327800 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.429795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.429882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.429907 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.429936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.429956 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.532938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.533014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.533034 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.533065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.533088 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.635456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.635513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.635527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.635545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.635560 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.739924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.740012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.740032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.740061 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.740085 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.843099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.843133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.843143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.843155 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.843163 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.945219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.945276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.945294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.945318 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.945340 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:11Z","lastTransitionTime":"2025-12-06T15:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:11 crc kubenswrapper[4848]: I1206 15:30:11.965590 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:11 crc kubenswrapper[4848]: E1206 15:30:11.965800 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.047195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.047236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.047248 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.047261 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.047270 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.155173 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.155252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.155271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.155300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.155327 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.257979 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.258020 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.258032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.258048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.258057 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.361203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.361314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.361333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.361358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.361375 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.464260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.464327 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.464343 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.464365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.464381 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.566721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.566772 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.566784 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.566802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.566814 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.670799 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.670853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.670869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.670892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.670909 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.774113 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.774186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.774204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.774230 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.774249 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.876968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.877014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.877030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.877046 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.877057 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.965643 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.965643 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:12 crc kubenswrapper[4848]: E1206 15:30:12.965873 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.965909 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:12 crc kubenswrapper[4848]: E1206 15:30:12.966864 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:12 crc kubenswrapper[4848]: E1206 15:30:12.966954 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.980198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.980262 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.980282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.980307 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.980329 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:12Z","lastTransitionTime":"2025-12-06T15:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:12 crc kubenswrapper[4848]: I1206 15:30:12.985080 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a556f222afb6496e8ab9b65756522a07e5e1c7c6fc161bac3206a75f8e49ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:12Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.005342 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.026456 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.043985 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc8499a5-41f5-49e8-a206-3240532ec6a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452884c2661a17df060b6d3fefb13bb082c2596cc34a1745b4d4941c26874a9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1
e6738898bec897670bfe90ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pl9fs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mrg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.064501 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qx6m8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c409d16-f97d-4bcd-bf25-b80af1b16922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:30:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T15:29:59Z\\\",\\\"message\\\":\\\"2025-12-06T15:29:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0\\\\n2025-12-06T15:29:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6fb63c7-69a5-49f5-b64b-87c763cafec0 to /host/opt/cni/bin/\\\\n2025-12-06T15:29:14Z [verbose] multus-daemon started\\\\n2025-12-06T15:29:14Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T15:29:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:30:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9kr6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qx6m8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.081215 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4088dff3-91f6-41f3-afad-2b6bc1cefe21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 15:29:05.311051 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 15:29:05.314064 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-416920503/tls.crt::/tmp/serving-cert-416920503/tls.key\\\\\\\"\\\\nI1206 15:29:10.855439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 15:29:10.857925 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 15:29:10.858855 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 15:29:10.858950 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1206 15:29:10.858958 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 15:29:10.866036 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 15:29:10.866067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 15:29:10.866075 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866083 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 15:29:10.866088 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 15:29:10.866091 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 15:29:10.866094 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 15:29:10.866098 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 15:29:10.869127 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T15:28:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T15:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T15:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T15:28:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.082886 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.082955 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.082969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.082987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.083028 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.094121 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T15:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T15:30:13Z is after 2025-08-24T17:21:41Z" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.131161 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qk7cq" podStartSLOduration=62.131133091 podStartE2EDuration="1m2.131133091s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.130828772 +0000 UTC m=+80.428839725" watchObservedRunningTime="2025-12-06 15:30:13.131133091 +0000 UTC m=+80.429144034" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.167168 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-96ljk" podStartSLOduration=61.167150391 podStartE2EDuration="1m1.167150391s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.147798282 +0000 UTC m=+80.445809205" watchObservedRunningTime="2025-12-06 15:30:13.167150391 +0000 UTC m=+80.465161304" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.185413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.185443 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.185451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.185463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.185473 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.203061 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zmmx7" podStartSLOduration=62.203032086 podStartE2EDuration="1m2.203032086s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.188763435 +0000 UTC m=+80.486774348" watchObservedRunningTime="2025-12-06 15:30:13.203032086 +0000 UTC m=+80.501043029" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.224421 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.224387412 podStartE2EDuration="1m2.224387412s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.203405237 +0000 UTC m=+80.501416180" watchObservedRunningTime="2025-12-06 15:30:13.224387412 +0000 UTC m=+80.522398365" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.224796 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.224786073 podStartE2EDuration="29.224786073s" podCreationTimestamp="2025-12-06 15:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.2243002 +0000 UTC m=+80.522311113" watchObservedRunningTime="2025-12-06 15:30:13.224786073 +0000 UTC m=+80.522797026" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.257860 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b5fcj" 
podStartSLOduration=62.257841237 podStartE2EDuration="1m2.257841237s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:13.257534589 +0000 UTC m=+80.555545502" watchObservedRunningTime="2025-12-06 15:30:13.257841237 +0000 UTC m=+80.555852150" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.287393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.287430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.287440 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.287455 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.287465 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.389619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.389673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.389690 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.389732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.389748 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.491643 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.491685 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.491718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.491738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.491752 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.593595 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.593629 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.593638 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.593653 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.593664 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.695660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.695734 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.695747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.695762 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.695773 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.797891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.798293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.798437 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.798619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.798766 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.900778 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.901149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.901374 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.901579 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.901817 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:13Z","lastTransitionTime":"2025-12-06T15:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:13 crc kubenswrapper[4848]: I1206 15:30:13.966512 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:13 crc kubenswrapper[4848]: E1206 15:30:13.967245 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.004175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.004571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.004778 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.004936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.005085 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.107687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.107770 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.107781 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.107798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.107810 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.210591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.210757 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.210777 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.210802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.210822 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.314973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.315060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.315097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.315155 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.315180 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.418299 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.418370 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.418386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.418407 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.418423 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.522129 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.522221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.522247 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.522275 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.522296 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.624513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.624586 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.624612 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.624642 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.624667 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.727143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.727207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.727222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.727240 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.727251 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.806531 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.806775 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 15:31:18.806731387 +0000 UTC m=+146.104742340 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.806871 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.806958 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.807099 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.807158 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.807207 4848 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:31:18.80718591 +0000 UTC m=+146.105196853 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.807238 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 15:31:18.807225281 +0000 UTC m=+146.105236224 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.829965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.830038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.830065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.830100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.830124 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.908765 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.908894 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909110 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909157 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909188 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909224 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 
15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909290 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909310 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909316 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 15:31:18.909273025 +0000 UTC m=+146.207283978 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.909420 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 15:31:18.909387279 +0000 UTC m=+146.207398352 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.933942 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.933996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.934009 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.934030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.934042 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:14Z","lastTransitionTime":"2025-12-06T15:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.966666 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.966669 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.966942 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.966982 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.967107 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:14 crc kubenswrapper[4848]: E1206 15:30:14.967544 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:14 crc kubenswrapper[4848]: I1206 15:30:14.967892 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.037003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.037445 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.037455 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.037470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.037480 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.139801 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.139884 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.139908 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.139943 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.139967 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.243724 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.243785 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.243798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.243819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.243832 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.346957 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.347056 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.347083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.347121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.347154 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.450491 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.450549 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.450568 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.450591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.450608 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.553403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.553685 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.553792 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.553857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.553915 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.656625 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.656767 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.656795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.656830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.656854 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.760571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.761415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.761486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.761557 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.761628 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.865070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.865137 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.865159 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.865183 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.865202 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.965859 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:15 crc kubenswrapper[4848]: E1206 15:30:15.966025 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.974838 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.974866 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.974878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.974891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:15 crc kubenswrapper[4848]: I1206 15:30:15.974902 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:15Z","lastTransitionTime":"2025-12-06T15:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.013857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.013895 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.013902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.013916 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.013926 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T15:30:16Z","lastTransitionTime":"2025-12-06T15:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.051255 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4"] Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.051684 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.054471 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.054601 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.054601 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.055188 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.090604 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qx6m8" podStartSLOduration=65.09058536 podStartE2EDuration="1m5.09058536s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:16.077912634 +0000 UTC m=+83.375923547" watchObservedRunningTime="2025-12-06 15:30:16.09058536 +0000 UTC m=+83.388596273" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.126837 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158a5bf6-01b2-472e-b26a-da4211d2d33e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.126873 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.126905 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5bf6-01b2-472e-b26a-da4211d2d33e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.126921 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/158a5bf6-01b2-472e-b26a-da4211d2d33e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.127024 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.135871 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podStartSLOduration=65.135852526 podStartE2EDuration="1m5.135852526s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:16.135686241 +0000 UTC m=+83.433697164" watchObservedRunningTime="2025-12-06 15:30:16.135852526 +0000 UTC m=+83.433863439" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.154001 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=65.153972389 podStartE2EDuration="1m5.153972389s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:16.153466304 +0000 UTC m=+83.451477257" watchObservedRunningTime="2025-12-06 15:30:16.153972389 +0000 UTC m=+83.451983302" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.227790 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5bf6-01b2-472e-b26a-da4211d2d33e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.227830 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/158a5bf6-01b2-472e-b26a-da4211d2d33e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.227896 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.227926 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158a5bf6-01b2-472e-b26a-da4211d2d33e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.227949 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.228007 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.228029 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/158a5bf6-01b2-472e-b26a-da4211d2d33e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.228817 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158a5bf6-01b2-472e-b26a-da4211d2d33e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.236213 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5bf6-01b2-472e-b26a-da4211d2d33e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.250937 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/158a5bf6-01b2-472e-b26a-da4211d2d33e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gz5c4\" (UID: \"158a5bf6-01b2-472e-b26a-da4211d2d33e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.364939 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.450796 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/2.log" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.454134 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerStarted","Data":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.454799 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.457654 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" event={"ID":"158a5bf6-01b2-472e-b26a-da4211d2d33e","Type":"ContainerStarted","Data":"bb435b02736cf00a20ee327762dec10dd820dcd940018eec6109610adc74e040"} Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.968857 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.968918 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:16 crc kubenswrapper[4848]: E1206 15:30:16.969027 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:16 crc kubenswrapper[4848]: I1206 15:30:16.969044 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:16 crc kubenswrapper[4848]: E1206 15:30:16.969198 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:16 crc kubenswrapper[4848]: E1206 15:30:16.969294 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.036257 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podStartSLOduration=65.036237974 podStartE2EDuration="1m5.036237974s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:16.51517159 +0000 UTC m=+83.813182503" watchObservedRunningTime="2025-12-06 15:30:17.036237974 +0000 UTC m=+84.334248887" Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.036898 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v4dm4"] Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.036982 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:17 crc kubenswrapper[4848]: E1206 15:30:17.037057 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.461987 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" event={"ID":"158a5bf6-01b2-472e-b26a-da4211d2d33e","Type":"ContainerStarted","Data":"e6c148612a6a33f627d65e704dd5fc1cbcea2ad8b9c0da68830a75a02c19d7bd"} Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.473056 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gz5c4" podStartSLOduration=66.473039957 podStartE2EDuration="1m6.473039957s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:17.472815971 +0000 UTC m=+84.770826884" watchObservedRunningTime="2025-12-06 15:30:17.473039957 +0000 UTC m=+84.771050870" Dec 06 15:30:17 crc kubenswrapper[4848]: I1206 15:30:17.977087 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 06 15:30:18 crc kubenswrapper[4848]: I1206 15:30:18.966008 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:18 crc kubenswrapper[4848]: I1206 15:30:18.966056 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:18 crc kubenswrapper[4848]: E1206 15:30:18.966108 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 15:30:18 crc kubenswrapper[4848]: E1206 15:30:18.966171 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 15:30:18 crc kubenswrapper[4848]: I1206 15:30:18.966235 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:18 crc kubenswrapper[4848]: E1206 15:30:18.966296 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4dm4" podUID="0f6acd83-a70e-4a34-96a5-ea7bd9e95935" Dec 06 15:30:18 crc kubenswrapper[4848]: I1206 15:30:18.966447 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:18 crc kubenswrapper[4848]: E1206 15:30:18.966515 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.167866 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.168059 4848 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.205078 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.205650 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.208812 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.209307 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.210203 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.211544 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-65225"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.211990 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.212635 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.212692 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.212958 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7h7ps"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.213367 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.214442 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.215038 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.215811 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.217000 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nhnnj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.217394 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.218602 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.218845 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.219294 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.219888 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.220596 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.220924 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.220965 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.220990 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.221163 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.223527 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.223644 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225100 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225371 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225520 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225770 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225892 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.225921 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226002 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226010 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" 
Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226083 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226102 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226620 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226725 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226806 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.226920 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.227199 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.227401 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.227481 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.227580 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.227918 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.228018 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.228163 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4nmrw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.228687 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.229732 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.229810 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.230386 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.231424 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.231928 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.232213 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ftn2g"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.233107 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.233721 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cjw6f"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.233822 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.234597 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.250180 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.254735 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.254767 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.261316 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.261548 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.261570 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghdr\" (UniqueName: \"kubernetes.io/projected/6c86a6b9-9082-4011-80ae-1e145c2580c1-kube-api-access-bghdr\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.261727 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c86a6b9-9082-4011-80ae-1e145c2580c1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.262055 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.262482 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.262790 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263143 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263215 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263354 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263549 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263632 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263713 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263766 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263796 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263857 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.263965 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264020 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264052 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264101 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264121 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264167 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264256 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264436 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264462 
4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264500 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264544 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264739 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264840 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264886 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.264858 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.265102 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.265469 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.265798 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.265894 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.266081 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.270481 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.270978 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.272413 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.272613 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.272720 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.272823 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.273034 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.273284 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.273815 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.274384 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.274932 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.275259 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.275575 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.276266 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.276795 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.276985 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.277087 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-szkmh"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.277250 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.277598 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.281176 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.281679 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.285353 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.306353 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bksdf"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.321193 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.322353 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-68b26"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.322875 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.323224 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.325111 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.325712 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.325902 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326032 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326244 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326355 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326455 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326659 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326661 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.326910 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.327038 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.327125 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.327805 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.328736 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.331214 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.331387 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.331883 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.332046 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.332991 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.334683 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.334854 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 
15:30:19.335400 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.335794 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.335884 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.336139 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.341459 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.341996 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.343209 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.346919 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4nmrw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.347047 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.348191 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.349525 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"service-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.349543 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.349647 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.355918 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.357168 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.358381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7h7ps"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.359455 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362710 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-serving-cert\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362742 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362762 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-config\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362781 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hq8\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-kube-api-access-x6hq8\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362796 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-node-pullsecrets\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362811 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w7st\" (UniqueName: \"kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362830 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-serving-cert\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362845 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-serving-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362877 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwtqv\" (UniqueName: \"kubernetes.io/projected/861c6ec8-def2-40b1-8b82-c99e683232ec-kube-api-access-hwtqv\") pod \"migrator-59844c95c7-c5lt8\" (UID: \"861c6ec8-def2-40b1-8b82-c99e683232ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362892 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-config\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc 
kubenswrapper[4848]: I1206 15:30:19.362910 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30cecc00-7a51-4ab5-b23e-399221fd73d8-serving-cert\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362926 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362941 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362973 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7pd7\" (UniqueName: 
\"kubernetes.io/projected/4a6e4efa-abe1-44da-9528-73fe113e016a-kube-api-access-x7pd7\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.362989 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363008 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363025 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363041 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " 
pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363059 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02495680-6e6d-4771-9cb8-8e3324e7c1c2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363075 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit-dir\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363092 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpj4\" (UniqueName: \"kubernetes.io/projected/30cecc00-7a51-4ab5-b23e-399221fd73d8-kube-api-access-zhpj4\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363110 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627a6345-6c21-4e20-bba4-5e9a30d2cb86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363126 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363141 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/239825af-9f47-45a4-933c-e60a3f4f54ad-trusted-ca\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363159 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363174 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363191 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06fa2713-32ed-4992-8024-48cfb318926d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363206 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363220 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363236 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwg85\" (UniqueName: \"kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363250 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85d6\" (UniqueName: \"kubernetes.io/projected/55929ace-2800-4b03-a017-d8aad4949e5b-kube-api-access-s85d6\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363265 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/02495680-6e6d-4771-9cb8-8e3324e7c1c2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghdr\" (UniqueName: \"kubernetes.io/projected/6c86a6b9-9082-4011-80ae-1e145c2580c1-kube-api-access-bghdr\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363312 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-config\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363334 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69z7\" (UniqueName: \"kubernetes.io/projected/1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a-kube-api-access-w69z7\") pod \"downloads-7954f5f757-ftn2g\" (UID: \"1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a\") " pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" 
Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363375 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f9090da-5bf9-4966-b42e-63de136da026-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363395 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-trusted-ca\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363416 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363436 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5sm\" (UniqueName: \"kubernetes.io/projected/c457951c-652c-4d1a-8478-507cfe00cd41-kube-api-access-4n5sm\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363458 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqz8\" (UniqueName: \"kubernetes.io/projected/d97d4b84-731c-4def-b9a9-a436e63f6f67-kube-api-access-2gqz8\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363501 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5d2w\" (UniqueName: \"kubernetes.io/projected/02495680-6e6d-4771-9cb8-8e3324e7c1c2-kube-api-access-k5d2w\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363734 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb9vw\" (UniqueName: \"kubernetes.io/projected/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-kube-api-access-lb9vw\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363756 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363777 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363818 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9090da-5bf9-4966-b42e-63de136da026-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363834 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55929ace-2800-4b03-a017-d8aad4949e5b-serving-cert\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363852 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c86a6b9-9082-4011-80ae-1e145c2580c1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363867 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7g7\" (UniqueName: \"kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363882 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363896 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-config\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363924 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-client\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363941 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bz8q\" (UniqueName: \"kubernetes.io/projected/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-kube-api-access-5bz8q\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: 
\"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-client\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.363976 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-stats-auth\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364000 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-audit-policies\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364016 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-encryption-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364033 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364050 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a6e4efa-abe1-44da-9528-73fe113e016a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364408 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c457951c-652c-4d1a-8478-507cfe00cd41-audit-dir\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364484 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-config\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364515 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364570 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364757 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-image-import-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364816 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvcs\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-kube-api-access-mvvcs\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364870 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-metrics-tls\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364903 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364952 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-auth-proxy-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.364980 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365152 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhh7n\" (UniqueName: \"kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365217 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365275 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-serving-cert\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365310 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-serving-cert\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365367 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365390 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mq7hx\" (UniqueName: \"kubernetes.io/projected/627a6345-6c21-4e20-bba4-5e9a30d2cb86-kube-api-access-mq7hx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365466 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-default-certificate\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365525 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365548 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-service-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365590 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-etcd-client\") pod \"etcd-operator-b45778765-bksdf\" (UID: 
\"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365674 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fa2713-32ed-4992-8024-48cfb318926d-config\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365732 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf6fn\" (UniqueName: \"kubernetes.io/projected/0171f176-2943-4c53-b1f5-03bd6fec2a01-kube-api-access-wf6fn\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365756 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-service-ca-bundle\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365803 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06fa2713-32ed-4992-8024-48cfb318926d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365829 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d97d4b84-731c-4def-b9a9-a436e63f6f67-machine-approver-tls\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.365853 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-encryption-config\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.366089 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.366152 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.366177 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies\") pod 
\"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.366223 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spg7s\" (UniqueName: \"kubernetes.io/projected/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-kube-api-access-spg7s\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.366263 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627a6345-6c21-4e20-bba4-5e9a30d2cb86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.368950 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369233 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369264 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/239825af-9f47-45a4-933c-e60a3f4f54ad-metrics-tls\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369295 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369327 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvg4z\" (UniqueName: \"kubernetes.io/projected/79983c07-34c7-4539-958e-73168bfa669d-kube-api-access-zvg4z\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369371 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369392 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-metrics-certs\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369413 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369434 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-service-ca-bundle\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369456 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-images\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.369747 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.370094 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.371810 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-65225"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.372149 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.372673 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.373036 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.373418 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.377797 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c86a6b9-9082-4011-80ae-1e145c2580c1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.382607 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.383432 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.388143 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kptkr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.389083 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.389277 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.389785 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.390361 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.407548 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.408203 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.408528 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.413057 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.417776 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nrffz"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.419163 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.422072 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.422672 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.423456 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.423672 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.424989 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.425137 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.427088 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.428868 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.428950 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.429158 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.429999 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.430466 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.431009 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.431374 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.431784 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.433101 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bksdf"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.434641 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.435268 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.437293 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cjw6f"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.438182 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ftn2g"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.439447 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.440235 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.441314 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.442154 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4hnzh"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.442673 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.443122 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8gt5"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.444529 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kptkr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.444624 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.444991 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.446374 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.447108 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nhnnj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.447633 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.448376 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-68b26"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.449174 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.450077 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.450977 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nrffz"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.451898 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.453143 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv"] Dec 
06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.454004 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.456795 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r25sj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.457237 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.457641 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zq7wp"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.458546 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.458599 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.459446 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.460528 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.461534 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r25sj"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.462440 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.463482 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.467408 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.467461 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zq7wp"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.470991 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.471645 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.471907 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.471936 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02495680-6e6d-4771-9cb8-8e3324e7c1c2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.471968 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit-dir\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627a6345-6c21-4e20-bba4-5e9a30d2cb86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472039 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472065 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472090 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06fa2713-32ed-4992-8024-48cfb318926d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472092 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit-dir\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472115 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwg85\" (UniqueName: \"kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s85d6\" (UniqueName: \"kubernetes.io/projected/55929ace-2800-4b03-a017-d8aad4949e5b-kube-api-access-s85d6\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472205 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02495680-6e6d-4771-9cb8-8e3324e7c1c2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472234 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcc11b1f-45ba-455f-a805-781f5512ebd2-tmpfs\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472263 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-config\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472288 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w69z7\" (UniqueName: \"kubernetes.io/projected/1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a-kube-api-access-w69z7\") pod \"downloads-7954f5f757-ftn2g\" (UID: \"1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a\") " pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472313 
4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472365 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f9090da-5bf9-4966-b42e-63de136da026-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472389 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5sm\" (UniqueName: \"kubernetes.io/projected/c457951c-652c-4d1a-8478-507cfe00cd41-kube-api-access-4n5sm\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472412 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqz8\" (UniqueName: \"kubernetes.io/projected/d97d4b84-731c-4def-b9a9-a436e63f6f67-kube-api-access-2gqz8\") pod \"machine-approver-56656f9798-6kl6l\" (UID: 
\"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472478 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9090da-5bf9-4966-b42e-63de136da026-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472523 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472547 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-config\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472575 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-d6wfs\" (UniqueName: \"kubernetes.io/projected/e4045402-271a-4cb5-9656-fb86ea257eb3-kube-api-access-d6wfs\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472579 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8gt5"] Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-client\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472644 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-stats-auth\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472671 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-auth-proxy-config\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472721 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-encryption-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472761 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c457951c-652c-4d1a-8478-507cfe00cd41-audit-dir\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472781 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472787 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvcs\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-kube-api-access-mvvcs\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472885 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472903 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-serving-cert\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472921 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472936 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7hx\" (UniqueName: \"kubernetes.io/projected/627a6345-6c21-4e20-bba4-5e9a30d2cb86-kube-api-access-mq7hx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472953 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 
15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472970 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-serving-cert\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.472991 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fa2713-32ed-4992-8024-48cfb318926d-config\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473027 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473046 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d97d4b84-731c-4def-b9a9-a436e63f6f67-machine-approver-tls\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473077 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473096 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmfbg\" (UniqueName: \"kubernetes.io/projected/91a5aab3-40d9-437c-9332-8099a6a2380b-kube-api-access-pmfbg\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spg7s\" (UniqueName: \"kubernetes.io/projected/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-kube-api-access-spg7s\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473133 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473149 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473164 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvg4z\" (UniqueName: \"kubernetes.io/projected/79983c07-34c7-4539-958e-73168bfa669d-kube-api-access-zvg4z\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473182 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473198 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-metrics-certs\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 
15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473215 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-service-ca-bundle\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473216 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473231 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da17a35e-90ac-4507-bc4f-b901f91051fe-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473257 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-serving-cert\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473275 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: 
\"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473297 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-config\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473316 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-node-pullsecrets\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473332 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-proxy-tls\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473349 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jp42\" (UniqueName: \"kubernetes.io/projected/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-kube-api-access-4jp42\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-config\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473383 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30cecc00-7a51-4ab5-b23e-399221fd73d8-serving-cert\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473398 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473414 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91a5aab3-40d9-437c-9332-8099a6a2380b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473431 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdx24\" (UniqueName: \"kubernetes.io/projected/fcc11b1f-45ba-455f-a805-781f5512ebd2-kube-api-access-fdx24\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473450 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473468 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7pd7\" (UniqueName: \"kubernetes.io/projected/4a6e4efa-abe1-44da-9528-73fe113e016a-kube-api-access-x7pd7\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473476 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627a6345-6c21-4e20-bba4-5e9a30d2cb86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473487 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473505 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473524 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473538 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/239825af-9f47-45a4-933c-e60a3f4f54ad-trusted-ca\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473567 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpj4\" (UniqueName: \"kubernetes.io/projected/30cecc00-7a51-4ab5-b23e-399221fd73d8-kube-api-access-zhpj4\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473582 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473598 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473613 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473634 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473662 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473679 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-trusted-ca\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" 
Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473713 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473731 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5d2w\" (UniqueName: \"kubernetes.io/projected/02495680-6e6d-4771-9cb8-8e3324e7c1c2-kube-api-access-k5d2w\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473753 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb9vw\" (UniqueName: \"kubernetes.io/projected/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-kube-api-access-lb9vw\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473770 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473788 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55929ace-2800-4b03-a017-d8aad4949e5b-serving-cert\") pod 
\"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4045402-271a-4cb5-9656-fb86ea257eb3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7g7\" (UniqueName: \"kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473479 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02495680-6e6d-4771-9cb8-8e3324e7c1c2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473850 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-images\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473970 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-client\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bz8q\" (UniqueName: \"kubernetes.io/projected/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-kube-api-access-5bz8q\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474043 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-audit-policies\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474073 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474099 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a6e4efa-abe1-44da-9528-73fe113e016a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474131 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474170 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24s2d\" (UniqueName: \"kubernetes.io/projected/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-kube-api-access-24s2d\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474195 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da17a35e-90ac-4507-bc4f-b901f91051fe-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474269 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-config\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474299 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474322 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474336 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f9090da-5bf9-4966-b42e-63de136da026-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474369 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-image-import-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474394 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-metrics-tls\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474417 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-auth-proxy-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474467 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhh7n\" (UniqueName: \"kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474497 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4045402-271a-4cb5-9656-fb86ea257eb3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474526 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474557 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-default-certificate\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474563 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474585 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-service-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474608 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-etcd-client\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474634 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-encryption-config\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6fn\" (UniqueName: \"kubernetes.io/projected/0171f176-2943-4c53-b1f5-03bd6fec2a01-kube-api-access-wf6fn\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474683 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06fa2713-32ed-4992-8024-48cfb318926d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474723 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-webhook-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474753 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-config\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474772 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474798 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627a6345-6c21-4e20-bba4-5e9a30d2cb86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474828 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474850 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/239825af-9f47-45a4-933c-e60a3f4f54ad-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474871 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474896 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474918 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474939 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-images\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da17a35e-90ac-4507-bc4f-b901f91051fe-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474990 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hq8\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-kube-api-access-x6hq8\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.475013 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89fl\" (UniqueName: \"kubernetes.io/projected/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-kube-api-access-w89fl\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.475040 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-serving-cert\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.475062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w7st\" (UniqueName: \"kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc 
kubenswrapper[4848]: I1206 15:30:19.475087 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-serving-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.475110 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwtqv\" (UniqueName: \"kubernetes.io/projected/861c6ec8-def2-40b1-8b82-c99e683232ec-kube-api-access-hwtqv\") pod \"migrator-59844c95c7-c5lt8\" (UID: \"861c6ec8-def2-40b1-8b82-c99e683232ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.475133 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.476019 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02495680-6e6d-4771-9cb8-8e3324e7c1c2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.476180 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " 
pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.476392 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.477113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.474435 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.477561 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-audit-policies\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.473611 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.478013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-etcd-client\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.478228 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-encryption-config\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.478440 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-client\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.478568 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9090da-5bf9-4966-b42e-63de136da026-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.478733 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.479338 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.480504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.488317 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.492269 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-config\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.492514 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config\") pod 
\"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.492639 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.492827 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.492856 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.493178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-images\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.493182 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c457951c-652c-4d1a-8478-507cfe00cd41-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.493555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.493608 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-audit\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.493976 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.494172 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/239825af-9f47-45a4-933c-e60a3f4f54ad-trusted-ca\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.494464 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/55929ace-2800-4b03-a017-d8aad4949e5b-trusted-ca\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.494753 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.495180 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.495338 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.495533 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.495573 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c457951c-652c-4d1a-8478-507cfe00cd41-audit-dir\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.495961 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.496555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-serving-cert\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.497006 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/239825af-9f47-45a4-933c-e60a3f4f54ad-metrics-tls\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.497150 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30cecc00-7a51-4ab5-b23e-399221fd73d8-service-ca-bundle\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.497395 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.497455 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-etcd-serving-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498042 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-auth-proxy-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498073 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498300 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc 
kubenswrapper[4848]: I1206 15:30:19.498411 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498474 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498653 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6e4efa-abe1-44da-9528-73fe113e016a-config\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.498760 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0171f176-2943-4c53-b1f5-03bd6fec2a01-node-pullsecrets\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.499110 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97d4b84-731c-4def-b9a9-a436e63f6f67-config\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 
15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.499453 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0171f176-2943-4c53-b1f5-03bd6fec2a01-image-import-ca\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.499480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55929ace-2800-4b03-a017-d8aad4949e5b-serving-cert\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.499682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.500444 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d97d4b84-731c-4def-b9a9-a436e63f6f67-machine-approver-tls\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.500948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06fa2713-32ed-4992-8024-48cfb318926d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.502422 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.502484 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30cecc00-7a51-4ab5-b23e-399221fd73d8-serving-cert\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.503316 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0171f176-2943-4c53-b1f5-03bd6fec2a01-serving-cert\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.503799 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-serving-cert\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.504088 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c457951c-652c-4d1a-8478-507cfe00cd41-encryption-config\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: 
\"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.504118 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.504093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627a6345-6c21-4e20-bba4-5e9a30d2cb86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.504270 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a6e4efa-abe1-44da-9528-73fe113e016a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.504357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.507748 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.517796 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fa2713-32ed-4992-8024-48cfb318926d-config\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.527360 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.547786 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.557077 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-stats-auth\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.567667 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.569384 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-service-ca-bundle\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.576574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da17a35e-90ac-4507-bc4f-b901f91051fe-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.576831 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89fl\" (UniqueName: \"kubernetes.io/projected/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-kube-api-access-w89fl\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577049 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcc11b1f-45ba-455f-a805-781f5512ebd2-tmpfs\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wfs\" (UniqueName: \"kubernetes.io/projected/e4045402-271a-4cb5-9656-fb86ea257eb3-kube-api-access-d6wfs\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577329 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-auth-proxy-config\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577443 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577487 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcc11b1f-45ba-455f-a805-781f5512ebd2-tmpfs\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577760 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmfbg\" (UniqueName: \"kubernetes.io/projected/91a5aab3-40d9-437c-9332-8099a6a2380b-kube-api-access-pmfbg\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577934 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.577937 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da17a35e-90ac-4507-bc4f-b901f91051fe-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-proxy-tls\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578023 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jp42\" (UniqueName: \"kubernetes.io/projected/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-kube-api-access-4jp42\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91a5aab3-40d9-437c-9332-8099a6a2380b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578064 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdx24\" (UniqueName: \"kubernetes.io/projected/fcc11b1f-45ba-455f-a805-781f5512ebd2-kube-api-access-fdx24\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578113 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578158 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4045402-271a-4cb5-9656-fb86ea257eb3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578181 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-images\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: 
\"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578221 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578244 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24s2d\" (UniqueName: \"kubernetes.io/projected/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-kube-api-access-24s2d\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578260 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da17a35e-90ac-4507-bc4f-b901f91051fe-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4045402-271a-4cb5-9656-fb86ea257eb3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.578353 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-webhook-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.587792 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.595380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-default-certificate\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.607571 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.628468 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.642256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-metrics-certs\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.647825 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.672040 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.687950 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.701036 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.708390 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.709276 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-config\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.728752 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.752816 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.768014 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" 
Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.787398 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.806987 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.827722 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.849008 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.868503 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.888985 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.894433 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-metrics-tls\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.927839 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.935365 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-config\") pod \"etcd-operator-b45778765-bksdf\" (UID: 
\"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.947982 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.958583 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-serving-cert\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.967964 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.977977 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79983c07-34c7-4539-958e-73168bfa669d-etcd-client\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:19 crc kubenswrapper[4848]: I1206 15:30:19.988091 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.008329 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.027839 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.034017 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-service-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.047143 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.049913 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79983c07-34c7-4539-958e-73168bfa669d-etcd-ca\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.072797 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.088344 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.108259 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.128674 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.142559 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da17a35e-90ac-4507-bc4f-b901f91051fe-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.147654 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.158135 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da17a35e-90ac-4507-bc4f-b901f91051fe-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.168823 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.188123 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.228805 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.235211 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghdr\" (UniqueName: \"kubernetes.io/projected/6c86a6b9-9082-4011-80ae-1e145c2580c1-kube-api-access-bghdr\") pod \"cluster-samples-operator-665b6dd947-nrf55\" (UID: \"6c86a6b9-9082-4011-80ae-1e145c2580c1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.242406 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4045402-271a-4cb5-9656-fb86ea257eb3-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.247662 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.268478 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.288404 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.289319 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4045402-271a-4cb5-9656-fb86ea257eb3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.308048 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.328217 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.348337 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.368296 
4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.371731 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-webhook-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.382179 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcc11b1f-45ba-455f-a805-781f5512ebd2-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.386912 4848 request.go:700] Waited for 1.013315798s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.389078 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.407839 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.423116 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.428061 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.447855 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.451969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91a5aab3-40d9-437c-9332-8099a6a2380b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.467914 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.488142 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.489484 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-images\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.508164 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.531817 4848 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.542242 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-proxy-tls\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.548319 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.568113 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578370 4848 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578451 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert podName:cbe6fc6e-d376-42f6-af93-04efcd6aa6f7 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:21.078435027 +0000 UTC m=+88.376445940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert") pod "catalog-operator-68c6474976-4f25z" (UID: "cbe6fc6e-d376-42f6-af93-04efcd6aa6f7") : failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578472 4848 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578480 4848 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578542 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert podName:6cb3f509-a2b5-4471-8e6c-d87f7d5289d2 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:21.078522849 +0000 UTC m=+88.376533752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert") pod "service-ca-operator-777779d784-9p2lr" (UID: "6cb3f509-a2b5-4471-8e6c-d87f7d5289d2") : failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578584 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config podName:6cb3f509-a2b5-4471-8e6c-d87f7d5289d2 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:21.07855104 +0000 UTC m=+88.376562003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config") pod "service-ca-operator-777779d784-9p2lr" (UID: "6cb3f509-a2b5-4471-8e6c-d87f7d5289d2") : failed to sync configmap cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578730 4848 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: E1206 15:30:20.578787 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert podName:cbe6fc6e-d376-42f6-af93-04efcd6aa6f7 nodeName:}" failed. No retries permitted until 2025-12-06 15:30:21.078751427 +0000 UTC m=+88.376762340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert") pod "catalog-operator-68c6474976-4f25z" (UID: "cbe6fc6e-d376-42f6-af93-04efcd6aa6f7") : failed to sync secret cache: timed out waiting for the condition Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.587390 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.614178 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.627733 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.631836 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55"] Dec 06 15:30:20 crc 
kubenswrapper[4848]: I1206 15:30:20.648763 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.668229 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.687828 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.708230 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.728173 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.748128 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.767762 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.787508 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.808230 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.829319 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.848622 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.868318 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.889834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.909369 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.928896 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.952384 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.966357 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.966446 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.966528 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.966689 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.968023 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 15:30:20 crc kubenswrapper[4848]: I1206 15:30:20.988350 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.028476 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.047492 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.068308 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.088469 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.098214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.098885 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: 
\"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.099096 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.099119 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.099767 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-config\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.101953 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-serving-cert\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.102679 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.104268 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-srv-cert\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.111034 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.127587 4848 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.148114 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.168156 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.188422 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.208561 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.228842 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.248102 4848 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.267354 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.301292 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.319208 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwg85\" (UniqueName: \"kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85\") pod \"oauth-openshift-558db77b4-4nmrw\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") " pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.339145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06fa2713-32ed-4992-8024-48cfb318926d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nz6tt\" (UID: \"06fa2713-32ed-4992-8024-48cfb318926d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.363633 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqz8\" (UniqueName: \"kubernetes.io/projected/d97d4b84-731c-4def-b9a9-a436e63f6f67-kube-api-access-2gqz8\") pod \"machine-approver-56656f9798-6kl6l\" (UID: \"d97d4b84-731c-4def-b9a9-a436e63f6f67\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 
15:30:21.380571 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85d6\" (UniqueName: \"kubernetes.io/projected/55929ace-2800-4b03-a017-d8aad4949e5b-kube-api-access-s85d6\") pod \"console-operator-58897d9998-nhnnj\" (UID: \"55929ace-2800-4b03-a017-d8aad4949e5b\") " pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.400652 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14799933-c7a1-4fd1-9ab2-7d2d9bfb645d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xrdhr\" (UID: \"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.407797 4848 request.go:700] Waited for 1.93391285s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.421200 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5sm\" (UniqueName: \"kubernetes.io/projected/c457951c-652c-4d1a-8478-507cfe00cd41-kube-api-access-4n5sm\") pod \"apiserver-7bbb656c7d-kn6kp\" (UID: \"c457951c-652c-4d1a-8478-507cfe00cd41\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.439295 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvcs\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-kube-api-access-mvvcs\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:21 
crc kubenswrapper[4848]: I1206 15:30:21.447604 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.461728 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.478971 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bz8q\" (UniqueName: \"kubernetes.io/projected/a9cbe0f4-d9df-4be5-a1b8-4c271ce24649-kube-api-access-5bz8q\") pod \"openshift-config-operator-7777fb866f-fr5gw\" (UID: \"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.484240 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" event={"ID":"6c86a6b9-9082-4011-80ae-1e145c2580c1","Type":"ContainerStarted","Data":"74c0f14c5cd417ef022b614cfb3f93540c7f45edcdab1763e92ca61968194622"} Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.484292 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" event={"ID":"6c86a6b9-9082-4011-80ae-1e145c2580c1","Type":"ContainerStarted","Data":"05d0991f591b66c612fb24df0b3ef08ce20ba221b763712d0ac16045a1d15f93"} Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.484307 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" event={"ID":"6c86a6b9-9082-4011-80ae-1e145c2580c1","Type":"ContainerStarted","Data":"10fe6bda5b57671aa7a2207cd439605c96115dd6207b3a1c42fe5a209b70d356"} Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.487455 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mq7hx\" (UniqueName: \"kubernetes.io/projected/627a6345-6c21-4e20-bba4-5e9a30d2cb86-kube-api-access-mq7hx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2wcz4\" (UID: \"627a6345-6c21-4e20-bba4-5e9a30d2cb86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.494386 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.502026 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.504444 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5d2w\" (UniqueName: \"kubernetes.io/projected/02495680-6e6d-4771-9cb8-8e3324e7c1c2-kube-api-access-k5d2w\") pod \"openshift-apiserver-operator-796bbdcf4f-79fvk\" (UID: \"02495680-6e6d-4771-9cb8-8e3324e7c1c2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.522066 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwtqv\" (UniqueName: \"kubernetes.io/projected/861c6ec8-def2-40b1-8b82-c99e683232ec-kube-api-access-hwtqv\") pod \"migrator-59844c95c7-c5lt8\" (UID: \"861c6ec8-def2-40b1-8b82-c99e683232ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.537643 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.545901 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.546318 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w7st\" (UniqueName: \"kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st\") pod \"console-f9d7485db-mdg75\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.551443 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.561314 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6fn\" (UniqueName: \"kubernetes.io/projected/0171f176-2943-4c53-b1f5-03bd6fec2a01-kube-api-access-wf6fn\") pod \"apiserver-76f77b778f-cjw6f\" (UID: \"0171f176-2943-4c53-b1f5-03bd6fec2a01\") " pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.577290 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.582930 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpj4\" (UniqueName: \"kubernetes.io/projected/30cecc00-7a51-4ab5-b23e-399221fd73d8-kube-api-access-zhpj4\") pod \"authentication-operator-69f744f599-65225\" (UID: \"30cecc00-7a51-4ab5-b23e-399221fd73d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.611217 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hq8\" (UniqueName: \"kubernetes.io/projected/239825af-9f47-45a4-933c-e60a3f4f54ad-kube-api-access-x6hq8\") pod \"ingress-operator-5b745b69d9-rk9hv\" (UID: \"239825af-9f47-45a4-933c-e60a3f4f54ad\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.625186 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spg7s\" (UniqueName: \"kubernetes.io/projected/ca40b01e-331c-4d2e-908a-f25b7b7b40e9-kube-api-access-spg7s\") pod \"router-default-5444994796-szkmh\" (UID: \"ca40b01e-331c-4d2e-908a-f25b7b7b40e9\") " pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.632461 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.640107 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.642752 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7g7\" (UniqueName: \"kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7\") pod \"controller-manager-879f6c89f-8jq72\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.648883 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.663288 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.665136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f9090da-5bf9-4966-b42e-63de136da026-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qvx45\" (UID: \"3f9090da-5bf9-4966-b42e-63de136da026\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.675733 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.694005 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb9vw\" (UniqueName: \"kubernetes.io/projected/00b1bde8-f54b-4f0a-af8b-3b2e95066b05-kube-api-access-lb9vw\") pod \"dns-operator-744455d44c-68b26\" (UID: \"00b1bde8-f54b-4f0a-af8b-3b2e95066b05\") " pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.703426 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhh7n\" (UniqueName: \"kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n\") pod \"route-controller-manager-6576b87f9c-48n2s\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.704893 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nhnnj"] Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.722402 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69z7\" (UniqueName: \"kubernetes.io/projected/1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a-kube-api-access-w69z7\") pod \"downloads-7954f5f757-ftn2g\" (UID: \"1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a\") " pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.735293 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.744250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7pd7\" (UniqueName: \"kubernetes.io/projected/4a6e4efa-abe1-44da-9528-73fe113e016a-kube-api-access-x7pd7\") pod \"machine-api-operator-5694c8668f-7h7ps\" (UID: \"4a6e4efa-abe1-44da-9528-73fe113e016a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.765423 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvg4z\" (UniqueName: \"kubernetes.io/projected/79983c07-34c7-4539-958e-73168bfa669d-kube-api-access-zvg4z\") pod \"etcd-operator-b45778765-bksdf\" (UID: \"79983c07-34c7-4539-958e-73168bfa669d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.773552 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.799295 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89fl\" (UniqueName: \"kubernetes.io/projected/ee5ab1d8-ba27-41de-a7e6-aed8c51b45af-kube-api-access-w89fl\") pod \"machine-config-operator-74547568cd-f2c5q\" (UID: \"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.818604 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.819183 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4"] Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.822506 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.826851 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wfs\" (UniqueName: \"kubernetes.io/projected/e4045402-271a-4cb5-9656-fb86ea257eb3-kube-api-access-d6wfs\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2x6w\" (UID: \"e4045402-271a-4cb5-9656-fb86ea257eb3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.827254 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmfbg\" (UniqueName: \"kubernetes.io/projected/91a5aab3-40d9-437c-9332-8099a6a2380b-kube-api-access-pmfbg\") pod \"multus-admission-controller-857f4d67dd-kptkr\" (UID: \"91a5aab3-40d9-437c-9332-8099a6a2380b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.831491 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.852003 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.852166 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jp42\" (UniqueName: \"kubernetes.io/projected/6cb3f509-a2b5-4471-8e6c-d87f7d5289d2-kube-api-access-4jp42\") pod \"service-ca-operator-777779d784-9p2lr\" (UID: \"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.862886 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.885293 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdx24\" (UniqueName: \"kubernetes.io/projected/fcc11b1f-45ba-455f-a805-781f5512ebd2-kube-api-access-fdx24\") pod \"packageserver-d55dfcdfc-l7mnj\" (UID: \"fcc11b1f-45ba-455f-a805-781f5512ebd2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.896130 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da17a35e-90ac-4507-bc4f-b901f91051fe-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndv9m\" (UID: \"da17a35e-90ac-4507-bc4f-b901f91051fe\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.902249 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24s2d\" (UniqueName: 
\"kubernetes.io/projected/cbe6fc6e-d376-42f6-af93-04efcd6aa6f7-kube-api-access-24s2d\") pod \"catalog-operator-68c6474976-4f25z\" (UID: \"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.916765 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.916745361 podStartE2EDuration="4.916745361s" podCreationTimestamp="2025-12-06 15:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:21.915054153 +0000 UTC m=+89.213065076" watchObservedRunningTime="2025-12-06 15:30:21.916745361 +0000 UTC m=+89.214756274" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.928942 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.936185 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4nmrw"] Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.939328 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp"] Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.948537 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.955110 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.971614 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.971778 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.981968 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.988800 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 15:30:21 crc kubenswrapper[4848]: I1206 15:30:21.997874 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.006103 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.007199 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.033045 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.033073 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.033189 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.038838 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.047149 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.069326 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.073679 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.112673 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.122443 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nrf55" podStartSLOduration=71.122419556 podStartE2EDuration="1m11.122419556s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:22.106035683 +0000 UTC m=+89.404046596" watchObservedRunningTime="2025-12-06 15:30:22.122419556 +0000 UTC m=+89.420430469" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.122690 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135405 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmlb\" (UniqueName: \"kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135469 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135520 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5d7qh\" (UniqueName: \"kubernetes.io/projected/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-kube-api-access-5d7qh\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135548 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsfk7\" (UniqueName: \"kubernetes.io/projected/ab287f93-4a57-497c-a075-007f697c2bb0-kube-api-access-vsfk7\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135610 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xg9w\" (UniqueName: \"kubernetes.io/projected/acbec6ea-48e4-4f2b-a5fc-070a06599b01-kube-api-access-8xg9w\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135713 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135792 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa0e198-c523-4da6-93c0-59350986464e-signing-cabundle\") pod \"service-ca-9c57cc56f-nrffz\" (UID: 
\"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135818 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/204ca83b-b95a-451a-bc43-a46bf0f2859d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135852 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab287f93-4a57-497c-a075-007f697c2bb0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135874 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa0e198-c523-4da6-93c0-59350986464e-signing-key\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135907 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc 
kubenswrapper[4848]: I1206 15:30:22.135954 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-srv-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.135979 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kndnf\" (UniqueName: \"kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136000 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136024 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bnz\" (UniqueName: \"kubernetes.io/projected/204ca83b-b95a-451a-bc43-a46bf0f2859d-kube-api-access-72bnz\") pod \"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136062 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c84\" (UniqueName: \"kubernetes.io/projected/dfa0e198-c523-4da6-93c0-59350986464e-kube-api-access-k4c84\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136138 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acbec6ea-48e4-4f2b-a5fc-070a06599b01-proxy-tls\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136279 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136306 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 
crc kubenswrapper[4848]: I1206 15:30:22.136328 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xxd\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136378 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136402 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.136429 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.137113 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.137159 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acbec6ea-48e4-4f2b-a5fc-070a06599b01-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.137371 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.139752 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.639735965 +0000 UTC m=+89.937746988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.242618 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.242875 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.74285404 +0000 UTC m=+90.040864963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243126 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xg9w\" (UniqueName: \"kubernetes.io/projected/acbec6ea-48e4-4f2b-a5fc-070a06599b01-kube-api-access-8xg9w\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243150 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjp9c\" (UniqueName: \"kubernetes.io/projected/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-kube-api-access-tjp9c\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243205 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-registration-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243256 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates\") pod 
\"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243280 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa0e198-c523-4da6-93c0-59350986464e-signing-cabundle\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243296 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-config-volume\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243312 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/204ca83b-b95a-451a-bc43-a46bf0f2859d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243328 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa0e198-c523-4da6-93c0-59350986464e-signing-key\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.243354 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ab287f93-4a57-497c-a075-007f697c2bb0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.245375 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.245425 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.245455 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-certs\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.245743 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.745729203 +0000 UTC m=+90.043740116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246389 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-srv-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246422 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kndnf\" (UniqueName: \"kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246448 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246472 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bnz\" (UniqueName: \"kubernetes.io/projected/204ca83b-b95a-451a-bc43-a46bf0f2859d-kube-api-access-72bnz\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246586 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.246613 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c84\" (UniqueName: \"kubernetes.io/projected/dfa0e198-c523-4da6-93c0-59350986464e-kube-api-access-k4c84\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.250411 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa0e198-c523-4da6-93c0-59350986464e-signing-key\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.250735 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acbec6ea-48e4-4f2b-a5fc-070a06599b01-proxy-tls\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.254805 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/204ca83b-b95a-451a-bc43-a46bf0f2859d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256708 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/05a9b88b-e88b-4af7-96bd-8bd051fb9353-kube-api-access-q4wjw\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256777 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256797 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256816 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xxd\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256837 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-node-bootstrap-token\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256884 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256923 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-65225"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.256945 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: 
I1206 15:30:22.257028 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257082 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acbec6ea-48e4-4f2b-a5fc-070a06599b01-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257141 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxhx\" (UniqueName: \"kubernetes.io/projected/9f8b2880-f901-410a-a27e-60691016e54e-kube-api-access-9pxhx\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257226 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-csi-data-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257260 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls\") pod \"image-registry-697d97f7c8-56mp5\" (UID: 
\"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257309 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjvl\" (UniqueName: \"kubernetes.io/projected/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-kube-api-access-6vjvl\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257361 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-socket-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257380 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-metrics-tls\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257403 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmlb\" (UniqueName: \"kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257455 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-plugins-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257484 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257598 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-mountpoint-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.257358 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-srv-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.258005 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a9b88b-e88b-4af7-96bd-8bd051fb9353-cert\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.258042 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5d7qh\" (UniqueName: \"kubernetes.io/projected/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-kube-api-access-5d7qh\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.258065 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsfk7\" (UniqueName: \"kubernetes.io/projected/ab287f93-4a57-497c-a075-007f697c2bb0-kube-api-access-vsfk7\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.258805 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.262453 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.266357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acbec6ea-48e4-4f2b-a5fc-070a06599b01-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.268082 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.268914 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.269612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.273008 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.274203 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca\") pod 
\"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.274578 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acbec6ea-48e4-4f2b-a5fc-070a06599b01-proxy-tls\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.281433 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.281777 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa0e198-c523-4da6-93c0-59350986464e-signing-cabundle\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.282598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab287f93-4a57-497c-a075-007f697c2bb0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.284167 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8xg9w\" (UniqueName: \"kubernetes.io/projected/acbec6ea-48e4-4f2b-a5fc-070a06599b01-kube-api-access-8xg9w\") pod \"machine-config-controller-84d6567774-8dnqm\" (UID: \"acbec6ea-48e4-4f2b-a5fc-070a06599b01\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.284558 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.288807 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.306225 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72bnz\" (UniqueName: \"kubernetes.io/projected/204ca83b-b95a-451a-bc43-a46bf0f2859d-kube-api-access-72bnz\") pod \"control-plane-machine-set-operator-78cbb6b69f-6wjr7\" (UID: \"204ca83b-b95a-451a-bc43-a46bf0f2859d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.313048 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.324358 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kndnf\" (UniqueName: \"kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf\") pod \"marketplace-operator-79b997595-5gkxw\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: W1206 15:30:22.333625 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861c6ec8_def2_40b1_8b82_c99e683232ec.slice/crio-385b65262b3252a5f3e7ca3ee62a5cdbcf9406fe1bebcd2e6a530b1effbff812 WatchSource:0}: Error finding container 385b65262b3252a5f3e7ca3ee62a5cdbcf9406fe1bebcd2e6a530b1effbff812: Status 404 returned error can't find the container with id 385b65262b3252a5f3e7ca3ee62a5cdbcf9406fe1bebcd2e6a530b1effbff812 Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.341536 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.347684 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c84\" (UniqueName: \"kubernetes.io/projected/dfa0e198-c523-4da6-93c0-59350986464e-kube-api-access-k4c84\") pod \"service-ca-9c57cc56f-nrffz\" (UID: \"dfa0e198-c523-4da6-93c0-59350986464e\") " pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.360806 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.361752 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362126 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.362227 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.862194184 +0000 UTC m=+90.160205097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362347 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-metrics-tls\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362400 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-plugins-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-mountpoint-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362447 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a9b88b-e88b-4af7-96bd-8bd051fb9353-cert\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: 
I1206 15:30:22.362555 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjp9c\" (UniqueName: \"kubernetes.io/projected/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-kube-api-access-tjp9c\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362580 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-registration-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362598 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-config-volume\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362633 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-certs\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 
15:30:22.362686 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/05a9b88b-e88b-4af7-96bd-8bd051fb9353-kube-api-access-q4wjw\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-node-bootstrap-token\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362882 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxhx\" (UniqueName: \"kubernetes.io/projected/9f8b2880-f901-410a-a27e-60691016e54e-kube-api-access-9pxhx\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362925 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-csi-data-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.362959 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjvl\" (UniqueName: \"kubernetes.io/projected/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-kube-api-access-6vjvl\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 
06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363015 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-socket-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363308 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-socket-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-mountpoint-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-plugins-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363479 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-registration-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.363555 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9f8b2880-f901-410a-a27e-60691016e54e-csi-data-dir\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.363670 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.863643535 +0000 UTC m=+90.161654548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.364671 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.365026 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-config-volume\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.373136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a9b88b-e88b-4af7-96bd-8bd051fb9353-cert\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " 
pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.384201 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-metrics-tls\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.384462 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-certs\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.386199 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-node-bootstrap-token\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.389398 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmlb\" (UniqueName: \"kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb\") pod \"collect-profiles-29417250-tqwpk\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.391443 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.393014 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.399826 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xxd\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.417933 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7qh\" (UniqueName: \"kubernetes.io/projected/4e096d6f-01b9-44a6-ad86-ca6898ffed4e-kube-api-access-5d7qh\") pod \"olm-operator-6b444d44fb-zdp8x\" (UID: \"4e096d6f-01b9-44a6-ad86-ca6898ffed4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.426850 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsfk7\" (UniqueName: \"kubernetes.io/projected/ab287f93-4a57-497c-a075-007f697c2bb0-kube-api-access-vsfk7\") pod \"package-server-manager-789f6589d5-5fpmg\" (UID: \"ab287f93-4a57-497c-a075-007f697c2bb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.454801 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.463575 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.463935 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.963899658 +0000 UTC m=+90.261910571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.464161 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.465839 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:22.964953119 +0000 UTC m=+90.262964032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.469953 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ftn2g"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.491376 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.496283 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjp9c\" (UniqueName: \"kubernetes.io/projected/b5bf9a89-1638-4bf9-b2a8-41f96e5220d0-kube-api-access-tjp9c\") pod \"dns-default-zq7wp\" (UID: \"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0\") " pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.496899 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.508948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxhx\" (UniqueName: \"kubernetes.io/projected/9f8b2880-f901-410a-a27e-60691016e54e-kube-api-access-9pxhx\") pod \"csi-hostpathplugin-w8gt5\" (UID: \"9f8b2880-f901-410a-a27e-60691016e54e\") " pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.513072 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" 
event={"ID":"861c6ec8-def2-40b1-8b82-c99e683232ec","Type":"ContainerStarted","Data":"385b65262b3252a5f3e7ca3ee62a5cdbcf9406fe1bebcd2e6a530b1effbff812"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.521714 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" event={"ID":"e5fee749-2cc4-41ea-9b22-7499624ae892","Type":"ContainerStarted","Data":"9ece09f732494eeedac8a44356bbfc0fe897e07ca6dc5c1ea560a60ce3c9a57f"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.523246 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" event={"ID":"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649","Type":"ContainerStarted","Data":"a90cd72e5b5b7bdc14e2cc060f90def044417a5cc86de01314992de82e18ba9d"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.523501 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjvl\" (UniqueName: \"kubernetes.io/projected/acea0ac6-d2f8-4b51-8bce-008f0176f6ce-kube-api-access-6vjvl\") pod \"machine-config-server-4hnzh\" (UID: \"acea0ac6-d2f8-4b51-8bce-008f0176f6ce\") " pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.545625 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-szkmh" event={"ID":"ca40b01e-331c-4d2e-908a-f25b7b7b40e9","Type":"ContainerStarted","Data":"912251712c8072dd2072917d8d27931d446e2cef1f503527b99dc6108e41f25d"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.545676 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-szkmh" event={"ID":"ca40b01e-331c-4d2e-908a-f25b7b7b40e9","Type":"ContainerStarted","Data":"7157e992c63693925cddbae9de116a6d2f94f6fd0073ac4867007894963ba34c"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.558630 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" event={"ID":"06fa2713-32ed-4992-8024-48cfb318926d","Type":"ContainerStarted","Data":"e8fe441272278e003e8161834bfa96cdeaf6e64fbc1059575d990d057bad64ae"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.562141 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" event={"ID":"c457951c-652c-4d1a-8478-507cfe00cd41","Type":"ContainerStarted","Data":"250d77b712738f2c102e51c2a409b84a46f999c35229378428241031e1232a97"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.562634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wjw\" (UniqueName: \"kubernetes.io/projected/05a9b88b-e88b-4af7-96bd-8bd051fb9353-kube-api-access-q4wjw\") pod \"ingress-canary-r25sj\" (UID: \"05a9b88b-e88b-4af7-96bd-8bd051fb9353\") " pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.566453 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.566778 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.066762696 +0000 UTC m=+90.364773599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.567331 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" event={"ID":"55929ace-2800-4b03-a017-d8aad4949e5b","Type":"ContainerStarted","Data":"84fa3f00343abf416b2c2e24cbe208906db3649a0c60cb2343a658a9be8cc790"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.567375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" event={"ID":"55929ace-2800-4b03-a017-d8aad4949e5b","Type":"ContainerStarted","Data":"b411ba8762d0ab81152441ffd9517bb15b748010211fd16045a8e053845d990c"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.568307 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.572121 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" event={"ID":"02495680-6e6d-4771-9cb8-8e3324e7c1c2","Type":"ContainerStarted","Data":"29b4ea5aa86f47e0cf00588ad901385611cb244c25cfd64a00f022e88e2ff26f"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.573230 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" 
event={"ID":"30cecc00-7a51-4ab5-b23e-399221fd73d8","Type":"ContainerStarted","Data":"93445b4cd825b1e6f1a9a6ce3de49ed55ee358492bcd3cbfb5fdfa22d6384d78"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.575190 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" event={"ID":"627a6345-6c21-4e20-bba4-5e9a30d2cb86","Type":"ContainerStarted","Data":"08c3f69ae0e86643e0a83144e70b50647d9c9abe409950305a91e3fd7f1bfe48"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.575242 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" event={"ID":"627a6345-6c21-4e20-bba4-5e9a30d2cb86","Type":"ContainerStarted","Data":"80185f0bae71c2ac2fbe7a6d80a145532afef4cc6a1ea422e507a8e4366fe59c"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.581263 4848 patch_prober.go:28] interesting pod/console-operator-58897d9998-nhnnj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.581313 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" podUID="55929ace-2800-4b03-a017-d8aad4949e5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.585376 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" event={"ID":"d97d4b84-731c-4def-b9a9-a436e63f6f67","Type":"ContainerStarted","Data":"20dc043fa16ad350c59c8ad1b3f51cefe634f6607f508b1e096c95511c235fa1"} Dec 06 15:30:22 
crc kubenswrapper[4848]: I1206 15:30:22.585417 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" event={"ID":"d97d4b84-731c-4def-b9a9-a436e63f6f67","Type":"ContainerStarted","Data":"4ba64e551e68b12a733507f2957628af415794053f9aeb06bb525451c78808ad"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.597180 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mdg75" event={"ID":"f250df39-ff33-455c-9edc-cb1997a8c782","Type":"ContainerStarted","Data":"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1"} Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.597219 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mdg75" event={"ID":"f250df39-ff33-455c-9edc-cb1997a8c782","Type":"ContainerStarted","Data":"29ef2bfa459ffb9a9cae93c1b05d94618b2f3ab5544c7ee1d52c8d930501739c"} Dec 06 15:30:22 crc kubenswrapper[4848]: W1206 15:30:22.621890 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9090da_5bf9_4966_b42e_63de136da026.slice/crio-9ef0e8c012022a84b95d031849c331f04d43e00d947687acdd397e718d339491 WatchSource:0}: Error finding container 9ef0e8c012022a84b95d031849c331f04d43e00d947687acdd397e718d339491: Status 404 returned error can't find the container with id 9ef0e8c012022a84b95d031849c331f04d43e00d947687acdd397e718d339491 Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.636545 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.647471 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 06 15:30:22 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:22 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:22 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.647816 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:22 crc kubenswrapper[4848]: W1206 15:30:22.664738 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239825af_9f47_45a4_933c_e60a3f4f54ad.slice/crio-6ad6d2287aa289198471366f0b23651b4b10919a71d133bbb5cffa9342eae5a8 WatchSource:0}: Error finding container 6ad6d2287aa289198471366f0b23651b4b10919a71d133bbb5cffa9342eae5a8: Status 404 returned error can't find the container with id 6ad6d2287aa289198471366f0b23651b4b10919a71d133bbb5cffa9342eae5a8 Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.668295 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.669914 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.169898892 +0000 UTC m=+90.467909815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.680475 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.683377 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.704284 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.711831 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-68b26"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.717247 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cjw6f"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.724969 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4hnzh" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.757604 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.770574 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.770783 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.270762152 +0000 UTC m=+90.568773065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.771095 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.771802 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.271786052 +0000 UTC m=+90.569796965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.775016 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r25sj" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.785434 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.863821 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.864152 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kptkr"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.869867 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.872372 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 
15:30:22.874811 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.374777573 +0000 UTC m=+90.672788566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.885993 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.903614 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7h7ps"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.905040 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bksdf"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.906499 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w"] Dec 06 15:30:22 crc kubenswrapper[4848]: I1206 15:30:22.977319 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:22 crc kubenswrapper[4848]: E1206 15:30:22.977573 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.477562859 +0000 UTC m=+90.775573772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.079465 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.079809 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.579793669 +0000 UTC m=+90.877804582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.180794 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.181093 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.681081231 +0000 UTC m=+90.979092144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.282116 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.282272 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.78225041 +0000 UTC m=+91.080261333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.282408 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.282649 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.782639912 +0000 UTC m=+91.080650825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.382949 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.383179 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.883148972 +0000 UTC m=+91.181159905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.383279 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.383542 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.883531553 +0000 UTC m=+91.181542466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.484415 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.484567 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.984546877 +0000 UTC m=+91.282557790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.484955 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.485219 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:23.985212186 +0000 UTC m=+91.283223099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.585915 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.586071 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.086041515 +0000 UTC m=+91.384052438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.586140 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.586475 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.086464458 +0000 UTC m=+91.384475371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.601308 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ftn2g" event={"ID":"1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a","Type":"ContainerStarted","Data":"fe00495b8520d36bb861e700cd6af44554bf2c3f1515c3fd35b6fccaf47e3c6e"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.602315 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" event={"ID":"239825af-9f47-45a4-933c-e60a3f4f54ad","Type":"ContainerStarted","Data":"6ad6d2287aa289198471366f0b23651b4b10919a71d133bbb5cffa9342eae5a8"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.603225 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" event={"ID":"b1d6d035-9129-459e-b913-95a87f868196","Type":"ContainerStarted","Data":"2dbc467cbdeae5ab30075f1090dd7e42ca731b8ef9d84600197bc0b5c7c70e8b"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.604015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" event={"ID":"b5de67fc-ba65-4752-b10a-86149771384a","Type":"ContainerStarted","Data":"65d808b5ec4545cd4e97f4f65b3adb6470c773403a02a6424001c53b61d3ded0"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.604820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" event={"ID":"3f9090da-5bf9-4966-b42e-63de136da026","Type":"ContainerStarted","Data":"9ef0e8c012022a84b95d031849c331f04d43e00d947687acdd397e718d339491"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.605531 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" event={"ID":"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d","Type":"ContainerStarted","Data":"bdacfc7000fd8c337cdb2bf1a66639348af2d10efe7dcf768fd1d0b862cbad10"} Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.637906 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:23 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:23 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:23 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.637961 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.687196 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.687575 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.187557314 +0000 UTC m=+91.485568227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.789729 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.790010 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.28999775 +0000 UTC m=+91.588008663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.891233 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.891434 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.391405426 +0000 UTC m=+91.689416339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.891535 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.891899 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.39188838 +0000 UTC m=+91.689899293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.982750 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q"] Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.982829 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk"] Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.992387 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.992680 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.492646897 +0000 UTC m=+91.790657810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:23 crc kubenswrapper[4848]: I1206 15:30:23.993035 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:23 crc kubenswrapper[4848]: E1206 15:30:23.993398 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.493390178 +0000 UTC m=+91.791401091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.006686 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.011038 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nrffz"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.017315 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.020865 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.028434 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.032566 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.036002 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.042611 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-szkmh" 
podStartSLOduration=72.042593738 podStartE2EDuration="1m12.042593738s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:24.036808791 +0000 UTC m=+91.334819704" watchObservedRunningTime="2025-12-06 15:30:24.042593738 +0000 UTC m=+91.340604651" Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.093552 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.093734 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.593686182 +0000 UTC m=+91.891697095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.093816 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.094122 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.594107374 +0000 UTC m=+91.892118377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.119490 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nhnnj" podStartSLOduration=72.119471366 podStartE2EDuration="1m12.119471366s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:24.11717446 +0000 UTC m=+91.415185373" watchObservedRunningTime="2025-12-06 15:30:24.119471366 +0000 UTC m=+91.417482279" Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.196037 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.196146 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mdg75" podStartSLOduration=72.196127468 podStartE2EDuration="1m12.196127468s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:24.158372709 +0000 UTC m=+91.456383622" watchObservedRunningTime="2025-12-06 15:30:24.196127468 +0000 UTC 
m=+91.494138381" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.196215 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.69619499 +0000 UTC m=+91.994205903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.196362 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.196650 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.696641173 +0000 UTC m=+91.994652086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.297845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.298048 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.798017318 +0000 UTC m=+92.096028231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.298250 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.298632 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.798619875 +0000 UTC m=+92.096630788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.317786 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2wcz4" podStartSLOduration=72.317767978 podStartE2EDuration="1m12.317767978s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:24.314964887 +0000 UTC m=+91.612975800" watchObservedRunningTime="2025-12-06 15:30:24.317767978 +0000 UTC m=+91.615778891" Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.399625 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.400099 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.899910048 +0000 UTC m=+92.197920971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.400145 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.400828 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:24.90067294 +0000 UTC m=+92.198683893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.501195 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.501612 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.001594721 +0000 UTC m=+92.299605634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.501646 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.501960 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.001945482 +0000 UTC m=+92.299956395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.603423 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.603595 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.103573534 +0000 UTC m=+92.401584447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.603818 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.604135 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.10412694 +0000 UTC m=+92.402137853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.611420 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" event={"ID":"55109807-aa20-4021-8a9f-f40b4c91c2df","Type":"ContainerStarted","Data":"1e329708038c6e88a199ebedb218c89acee49d960b9b3ad79aa8d9ceae139cfb"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.612318 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" event={"ID":"4a6e4efa-abe1-44da-9528-73fe113e016a","Type":"ContainerStarted","Data":"c6e72b60d91feb12f7e5e7b2fb1c49e91a6625d8c910855a837efe578d8f6640"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.613213 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" event={"ID":"da17a35e-90ac-4507-bc4f-b901f91051fe","Type":"ContainerStarted","Data":"5d991d892fdf58367c5856e8a5b5d88e2d13c0262466afc0c6f4ce6a2ee5e0f9"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.614009 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" event={"ID":"4e096d6f-01b9-44a6-ad86-ca6898ffed4e","Type":"ContainerStarted","Data":"ea8154685218caa7273cc7d60a63b7dbb2f592ddd7c4e1dc142ab2c378ec9d26"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.614824 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" event={"ID":"fcc11b1f-45ba-455f-a805-781f5512ebd2","Type":"ContainerStarted","Data":"39a315efc5a706c6388d700489662cbcc4162941c0c2c8cd0a5cff3629a75efe"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.615611 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" event={"ID":"204ca83b-b95a-451a-bc43-a46bf0f2859d","Type":"ContainerStarted","Data":"95dfec043a9096fa2cb798930035bd6fad1dd93d094eaf6ae12051d7de0720d8"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.616302 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" event={"ID":"00b1bde8-f54b-4f0a-af8b-3b2e95066b05","Type":"ContainerStarted","Data":"5d6239b47cc0f7816b3ffecd9fdea50017f79930e20ca13f004da8b12f1dd998"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.617406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" event={"ID":"0171f176-2943-4c53-b1f5-03bd6fec2a01","Type":"ContainerStarted","Data":"8b1d00baaa61c5421cd1a2f42e0d530c11aa18d5082e54984ed278e2624ed3bf"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.618225 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerStarted","Data":"00f0b10bc96b73fed58068c9e54ab52c94d55b507b80c65a536df9fcd76f99bf"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.618949 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" event={"ID":"acbec6ea-48e4-4f2b-a5fc-070a06599b01","Type":"ContainerStarted","Data":"4f52659796c2087a2d95c6abb6ebf9fdbe450b05d3deb3252aadb322f1ba6fd4"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.621685 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" event={"ID":"79983c07-34c7-4539-958e-73168bfa669d","Type":"ContainerStarted","Data":"abcc7c8599732f9bf0a48a83d7415291d1163fcd190c6f53a8fadae612ede671"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.633876 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" event={"ID":"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2","Type":"ContainerStarted","Data":"675879cf0dd6ef6363ac9b788b45d0d2c2b5284aa1b7215f8c4bf75526aa9713"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.638304 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r25sj"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.639611 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:24 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:24 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:24 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.639689 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.643085 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" event={"ID":"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7","Type":"ContainerStarted","Data":"7794560e787c650599a06a3f36699ddf907ee77368db35435cc395a27c9ad40a"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.647043 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" event={"ID":"e4045402-271a-4cb5-9656-fb86ea257eb3","Type":"ContainerStarted","Data":"26b51f86f0bb13dcd9cae5550ff9d1fdb2e9ebf59282b956c64c8f220647ef47"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.648638 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" event={"ID":"91a5aab3-40d9-437c-9332-8099a6a2380b","Type":"ContainerStarted","Data":"9a2d3a028c0dd0645949e6065eb993fcf8965234138068d23960d76a571c23da"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.651235 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" event={"ID":"e5fee749-2cc4-41ea-9b22-7499624ae892","Type":"ContainerStarted","Data":"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.653076 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" event={"ID":"dfa0e198-c523-4da6-93c0-59350986464e","Type":"ContainerStarted","Data":"252fb35c730b5d5ccc15e1fce1e5d63f413847e7a9c2d6180959cc52b1fee773"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.654413 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4hnzh" event={"ID":"acea0ac6-d2f8-4b51-8bce-008f0176f6ce","Type":"ContainerStarted","Data":"b379de70708c2e901d81d77791477f3366396b98f01d7d4c64beb5ffb9fa4b94"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.658845 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" 
event={"ID":"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af","Type":"ContainerStarted","Data":"2f52718f29bac87a0ebecc27070b264c1c090ca2c1fbe690cd8eaca1cc948684"} Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.705310 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.705471 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.205448634 +0000 UTC m=+92.503459547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.705534 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.706203 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.206191705 +0000 UTC m=+92.504202618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.806653 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.806825 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.306790357 +0000 UTC m=+92.604801270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.807177 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.807427 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.307415436 +0000 UTC m=+92.605426339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.833154 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg"] Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.909645 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.909796 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.409775649 +0000 UTC m=+92.707786562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.909979 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:24 crc kubenswrapper[4848]: E1206 15:30:24.911709 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.411684354 +0000 UTC m=+92.709695267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:24 crc kubenswrapper[4848]: I1206 15:30:24.933208 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8gt5"] Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.012382 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.012659 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.512644157 +0000 UTC m=+92.810655060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: W1206 15:30:25.079831 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab287f93_4a57_497c_a075_007f697c2bb0.slice/crio-c8a6046528582522bf387ea79a5f9af1f94fb5043292b8da26bc7330a62935de WatchSource:0}: Error finding container c8a6046528582522bf387ea79a5f9af1f94fb5043292b8da26bc7330a62935de: Status 404 returned error can't find the container with id c8a6046528582522bf387ea79a5f9af1f94fb5043292b8da26bc7330a62935de Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.114581 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.114969 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.614955788 +0000 UTC m=+92.912966701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.135609 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zq7wp"] Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.216409 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.216564 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.71653565 +0000 UTC m=+93.014546563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.216678 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.217112 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.717102026 +0000 UTC m=+93.015112939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.317416 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.317784 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.81776686 +0000 UTC m=+93.115777773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.317928 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.318194 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.818185683 +0000 UTC m=+93.116196596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.419475 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.419764 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.919739823 +0000 UTC m=+93.217750736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.419859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.420106 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:25.920098373 +0000 UTC m=+93.218109286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.520687 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.520970 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.020943902 +0000 UTC m=+93.318954815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.521146 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.521450 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.021443457 +0000 UTC m=+93.319454370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.623816 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.624650 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.124632424 +0000 UTC m=+93.422643337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.653495 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:25 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:25 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:25 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.653552 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.712894 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" event={"ID":"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af","Type":"ContainerStarted","Data":"381228af11f61d6faa97b266797895d81a627d9204a765a39eb832bfe76bae8d"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.728257 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: 
\"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.728628 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.228608535 +0000 UTC m=+93.526619448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.739834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" event={"ID":"6cb3f509-a2b5-4471-8e6c-d87f7d5289d2","Type":"ContainerStarted","Data":"9e67366d27880d967a3a1cc0ae0973711ecb8770bc207dbd63369fb811f52344"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.742987 4848 generic.go:334] "Generic (PLEG): container finished" podID="c457951c-652c-4d1a-8478-507cfe00cd41" containerID="142abfa27d2666cd429fab09121c073b255500a0c5300ed8a5a20b74c81520fc" exitCode=0 Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.743613 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" event={"ID":"c457951c-652c-4d1a-8478-507cfe00cd41","Type":"ContainerDied","Data":"142abfa27d2666cd429fab09121c073b255500a0c5300ed8a5a20b74c81520fc"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.766109 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-9p2lr" podStartSLOduration=73.766094466 podStartE2EDuration="1m13.766094466s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:25.765150699 +0000 UTC m=+93.063161612" watchObservedRunningTime="2025-12-06 15:30:25.766094466 +0000 UTC m=+93.064105379" Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.804328 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" event={"ID":"239825af-9f47-45a4-933c-e60a3f4f54ad","Type":"ContainerStarted","Data":"a04fd3861b952ab9a77957ed192396b188301b2560d2106e1dd6fe009ba8c25f"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.835357 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.835555 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.33553516 +0000 UTC m=+93.633546073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.836089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.837077 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.337064264 +0000 UTC m=+93.635075237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.849307 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" event={"ID":"3f9090da-5bf9-4966-b42e-63de136da026","Type":"ContainerStarted","Data":"3b2aabd414a7cb16455698776eb0bcd4504c83872226db066a21e0f429c85f92"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.879880 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" event={"ID":"ab287f93-4a57-497c-a075-007f697c2bb0","Type":"ContainerStarted","Data":"c8a6046528582522bf387ea79a5f9af1f94fb5043292b8da26bc7330a62935de"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.893574 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" event={"ID":"acbec6ea-48e4-4f2b-a5fc-070a06599b01","Type":"ContainerStarted","Data":"80c0c96d17f74e4625bcf0fe98bda03c3fa042890b60c6d41970890288c23b27"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.896224 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" event={"ID":"861c6ec8-def2-40b1-8b82-c99e683232ec","Type":"ContainerStarted","Data":"583a031df36d338a90ae8665abc981db25ed169819a1719f74b0d4b14d9d8bcb"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.934815 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" event={"ID":"b1d6d035-9129-459e-b913-95a87f868196","Type":"ContainerStarted","Data":"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.935951 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.936569 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:25 crc kubenswrapper[4848]: E1206 15:30:25.937350 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.437336057 +0000 UTC m=+93.735346970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.951675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" event={"ID":"4a6e4efa-abe1-44da-9528-73fe113e016a","Type":"ContainerStarted","Data":"d4b8904566dba5d77d031fdb489ed4227d427838bea391400e82360cdef7d364"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.957032 4848 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-48n2s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.957074 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" podUID="b1d6d035-9129-459e-b913-95a87f868196" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.961768 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" event={"ID":"d97d4b84-731c-4def-b9a9-a436e63f6f67","Type":"ContainerStarted","Data":"16cbc2f02f7275c07fd9c96cdb86e48054a057dec7a0db225066243e2c671062"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.963363 4848 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qvx45" podStartSLOduration=73.963349437 podStartE2EDuration="1m13.963349437s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:25.882899336 +0000 UTC m=+93.180910249" watchObservedRunningTime="2025-12-06 15:30:25.963349437 +0000 UTC m=+93.261360350" Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.989187 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ftn2g" event={"ID":"1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a","Type":"ContainerStarted","Data":"a45ecb55be3fcc266127bd956c2ee7c5b939e7820968005512d5e033013038a8"} Dec 06 15:30:25 crc kubenswrapper[4848]: I1206 15:30:25.989233 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.004408 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" podStartSLOduration=74.004394362 podStartE2EDuration="1m14.004394362s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:25.964652435 +0000 UTC m=+93.262663368" watchObservedRunningTime="2025-12-06 15:30:26.004394362 +0000 UTC m=+93.302405275" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.007908 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" event={"ID":"cbe6fc6e-d376-42f6-af93-04efcd6aa6f7","Type":"ContainerStarted","Data":"2c337da637621cadda5013274b7c04692eaa15c940879092c0522f51ca24e180"} Dec 
06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.008879 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.016252 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-ftn2g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.016295 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ftn2g" podUID="1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.019964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" event={"ID":"e4045402-271a-4cb5-9656-fb86ea257eb3","Type":"ContainerStarted","Data":"8e1e92b83b17c6e09952f53f99179aeaf22d307f2be70797b5f16172c4943198"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.036054 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6kl6l" podStartSLOduration=75.036036544 podStartE2EDuration="1m15.036036544s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.005915416 +0000 UTC m=+93.303926329" watchObservedRunningTime="2025-12-06 15:30:26.036036544 +0000 UTC m=+93.334047457" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.036529 4848 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-4f25z container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.036577 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" podUID="cbe6fc6e-d376-42f6-af93-04efcd6aa6f7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.044589 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.045216 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.545197149 +0000 UTC m=+93.843208142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.048468 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" event={"ID":"204ca83b-b95a-451a-bc43-a46bf0f2859d","Type":"ContainerStarted","Data":"7eb5923cd76ea6906396331a6b8f239928847d06dc1efa1f0206703426d8bd62"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.084782 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" event={"ID":"30cecc00-7a51-4ab5-b23e-399221fd73d8","Type":"ContainerStarted","Data":"4d7deb7e67e150a3cc749cc5ef91467f97d8a6c0546008e659ebad560a4d6762"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.086942 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2x6w" podStartSLOduration=74.086932243 podStartE2EDuration="1m14.086932243s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.084540554 +0000 UTC m=+93.382551467" watchObservedRunningTime="2025-12-06 15:30:26.086932243 +0000 UTC m=+93.384943156" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.087321 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ftn2g" 
podStartSLOduration=74.087316864 podStartE2EDuration="1m14.087316864s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.035997533 +0000 UTC m=+93.334008446" watchObservedRunningTime="2025-12-06 15:30:26.087316864 +0000 UTC m=+93.385327767" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.107582 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" event={"ID":"14799933-c7a1-4fd1-9ab2-7d2d9bfb645d","Type":"ContainerStarted","Data":"2b3b91c8463647976b8141a935a7102bb4d91a35bf192c87ebba08bc8662f4fb"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.121321 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" event={"ID":"02495680-6e6d-4771-9cb8-8e3324e7c1c2","Type":"ContainerStarted","Data":"d0ea9b9a941111d7f675fcd0b217a8aa1b0a4861d3c9bf8bcddc86942d655286"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.140007 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" podStartSLOduration=74.139987333 podStartE2EDuration="1m14.139987333s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.129659426 +0000 UTC m=+93.427670329" watchObservedRunningTime="2025-12-06 15:30:26.139987333 +0000 UTC m=+93.437998246" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.142681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" 
event={"ID":"06fa2713-32ed-4992-8024-48cfb318926d","Type":"ContainerStarted","Data":"eb9f7be892d9ec278613dd4f2592d290b133ab59d6bf0392a7ba3eb7643891f8"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.150130 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zq7wp" event={"ID":"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0","Type":"ContainerStarted","Data":"34fb2c3cb4357b65e0c0063e42ed90ec75a07440f37620ffefd00c4a7e588fc5"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.150314 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.150601 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.650574789 +0000 UTC m=+93.948585722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.150843 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.152483 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.652473704 +0000 UTC m=+93.950484617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.161372 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6wjr7" podStartSLOduration=74.16135464 podStartE2EDuration="1m14.16135464s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.159874818 +0000 UTC m=+93.457885731" watchObservedRunningTime="2025-12-06 15:30:26.16135464 +0000 UTC m=+93.459365553" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.188334 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" event={"ID":"9f8b2880-f901-410a-a27e-60691016e54e","Type":"ContainerStarted","Data":"445b4d523702f92277d7a13853171c31986a4f9ffef382681a511c2163d60e8c"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.196028 4848 generic.go:334] "Generic (PLEG): container finished" podID="a9cbe0f4-d9df-4be5-a1b8-4c271ce24649" containerID="ea00e2969f35efd01f2208f1572e31437a0ed924cfa0ed8f9f9e94307b57a7e5" exitCode=0 Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.196101 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" event={"ID":"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649","Type":"ContainerDied","Data":"ea00e2969f35efd01f2208f1572e31437a0ed924cfa0ed8f9f9e94307b57a7e5"} Dec 06 15:30:26 crc 
kubenswrapper[4848]: I1206 15:30:26.200265 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-65225" podStartSLOduration=75.200248633 podStartE2EDuration="1m15.200248633s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.199009257 +0000 UTC m=+93.497020170" watchObservedRunningTime="2025-12-06 15:30:26.200248633 +0000 UTC m=+93.498259546" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.202528 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r25sj" event={"ID":"05a9b88b-e88b-4af7-96bd-8bd051fb9353","Type":"ContainerStarted","Data":"9c2153f9fd8c1f1e5364a6095360b47996a83830b43e796ff7a5e3a4c39fd962"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.202563 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r25sj" event={"ID":"05a9b88b-e88b-4af7-96bd-8bd051fb9353","Type":"ContainerStarted","Data":"1efb1ae67acddd80cfbdcd77b4a084a0633dbc583b23a1489e62350b2d21960d"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.215866 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" event={"ID":"55109807-aa20-4021-8a9f-f40b4c91c2df","Type":"ContainerStarted","Data":"c95ca69fafaffd6980c89a1175c8306242247e0a20d5daa16092ccb41e3fdc8e"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.217821 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" event={"ID":"b5de67fc-ba65-4752-b10a-86149771384a","Type":"ContainerStarted","Data":"9ca702a505344b994f1d4cf8b806dcf50e7333e1796c4fce986c80b65af57e18"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.218437 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.247176 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerStarted","Data":"4076a09d59ac0d5d7055a84db2c7974c7145f95532fbe10b74659660bf761617"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.248003 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.251891 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.253279 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.753264732 +0000 UTC m=+94.051275645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.263393 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.269526 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xrdhr" podStartSLOduration=74.269504641 podStartE2EDuration="1m14.269504641s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.249776912 +0000 UTC m=+93.547787825" watchObservedRunningTime="2025-12-06 15:30:26.269504641 +0000 UTC m=+93.567515614" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.269964 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-79fvk" podStartSLOduration=75.269959274 podStartE2EDuration="1m15.269959274s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.218915191 +0000 UTC m=+93.516926104" watchObservedRunningTime="2025-12-06 15:30:26.269959274 +0000 UTC m=+93.567970187" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.280468 4848 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" podStartSLOduration=74.280452497 podStartE2EDuration="1m14.280452497s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.279919981 +0000 UTC m=+93.577930894" watchObservedRunningTime="2025-12-06 15:30:26.280452497 +0000 UTC m=+93.578463410" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.290600 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5gkxw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.290797 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.306097 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" event={"ID":"4e096d6f-01b9-44a6-ad86-ca6898ffed4e","Type":"ContainerStarted","Data":"aaa364bd8183cdf0b2f29f89655f39a08b15a44f406b9fa751ab4125e64d1c97"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.308105 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.320240 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r25sj" podStartSLOduration=7.320220144 
podStartE2EDuration="7.320220144s" podCreationTimestamp="2025-12-06 15:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.309401841 +0000 UTC m=+93.607412754" watchObservedRunningTime="2025-12-06 15:30:26.320220144 +0000 UTC m=+93.618231057" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.322352 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" event={"ID":"dfa0e198-c523-4da6-93c0-59350986464e","Type":"ContainerStarted","Data":"d82b383a10f6af795c47c85978540cb2d041f040ace385ebddce3f74b1599582"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.325904 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" event={"ID":"fcc11b1f-45ba-455f-a805-781f5512ebd2","Type":"ContainerStarted","Data":"90260b07ffe89e386824834edc981daad3f046983d59f27cc6a0812ea7e0a281"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.327144 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.357468 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.359550 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 15:30:26.859537548 +0000 UTC m=+94.157548461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.359681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4hnzh" event={"ID":"acea0ac6-d2f8-4b51-8bce-008f0176f6ce","Type":"ContainerStarted","Data":"d300467005b7be945e40d3617987e73068491a51876ad61a3454c63bd126e7b4"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.363902 4848 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l7mnj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body= Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.363937 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" podUID="fcc11b1f-45ba-455f-a805-781f5512ebd2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.377096 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" podStartSLOduration=26.377073135 podStartE2EDuration="26.377073135s" podCreationTimestamp="2025-12-06 15:30:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.341758996 +0000 UTC m=+93.639769909" watchObservedRunningTime="2025-12-06 15:30:26.377073135 +0000 UTC m=+93.675084048" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.392197 4848 generic.go:334] "Generic (PLEG): container finished" podID="0171f176-2943-4c53-b1f5-03bd6fec2a01" containerID="a59d7f6d5234d60a064e8812ffe114303e74417fa7af22334a049940be94e0f8" exitCode=0 Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.393000 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" event={"ID":"0171f176-2943-4c53-b1f5-03bd6fec2a01","Type":"ContainerDied","Data":"a59d7f6d5234d60a064e8812ffe114303e74417fa7af22334a049940be94e0f8"} Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.393077 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.406775 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" podStartSLOduration=74.406759051 podStartE2EDuration="1m14.406759051s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.403363243 +0000 UTC m=+93.701374146" watchObservedRunningTime="2025-12-06 15:30:26.406759051 +0000 UTC m=+93.704769954" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.422894 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nz6tt" podStartSLOduration=74.422878916 podStartE2EDuration="1m14.422878916s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.42095833 +0000 UTC m=+93.718969243" watchObservedRunningTime="2025-12-06 15:30:26.422878916 +0000 UTC m=+93.720889829" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.458841 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.459935 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:26.959915255 +0000 UTC m=+94.257926158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.465713 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" podStartSLOduration=74.465685531 podStartE2EDuration="1m14.465685531s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.443363637 +0000 UTC m=+93.741374550" watchObservedRunningTime="2025-12-06 15:30:26.465685531 +0000 UTC m=+93.763696444" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.512002 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" podStartSLOduration=74.511983447 podStartE2EDuration="1m14.511983447s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.511968756 +0000 UTC m=+93.809979669" watchObservedRunningTime="2025-12-06 15:30:26.511983447 +0000 UTC m=+93.809994360" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.512795 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4hnzh" podStartSLOduration=7.51279057 podStartE2EDuration="7.51279057s" podCreationTimestamp="2025-12-06 15:30:19 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.488200121 +0000 UTC m=+93.786211034" watchObservedRunningTime="2025-12-06 15:30:26.51279057 +0000 UTC m=+93.810801483" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.558581 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nrffz" podStartSLOduration=74.55856177 podStartE2EDuration="1m14.55856177s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.551502697 +0000 UTC m=+93.849513610" watchObservedRunningTime="2025-12-06 15:30:26.55856177 +0000 UTC m=+93.856572683" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.560185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.560578 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.060561419 +0000 UTC m=+94.358572332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.592036 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" podStartSLOduration=75.592014786 podStartE2EDuration="1m15.592014786s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:26.590931905 +0000 UTC m=+93.888942818" watchObservedRunningTime="2025-12-06 15:30:26.592014786 +0000 UTC m=+93.890025699" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.606181 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zdp8x" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.641448 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:26 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:26 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:26 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.641504 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.662926 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.663887 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.16387007 +0000 UTC m=+94.461880983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.764522 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.767116 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 
15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.767429 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.267418797 +0000 UTC m=+94.565429710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.871610 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.872170 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.372153498 +0000 UTC m=+94.670164411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:26 crc kubenswrapper[4848]: I1206 15:30:26.973144 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:26 crc kubenswrapper[4848]: E1206 15:30:26.973482 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.473466622 +0000 UTC m=+94.771477535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.074086 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.074277 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.57424491 +0000 UTC m=+94.872255823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.176110 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.176488 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.67647317 +0000 UTC m=+94.974484093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.277355 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.277513 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.777488914 +0000 UTC m=+95.075499837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.277654 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.278041 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.778030209 +0000 UTC m=+95.076041122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.378322 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.378647 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.878630652 +0000 UTC m=+95.176641565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.402533 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zq7wp" event={"ID":"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0","Type":"ContainerStarted","Data":"93ce28b019459f05218fef4d23dbcc175cef84fb69cf086c51676e2d93bf2c77"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.402587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zq7wp" event={"ID":"b5bf9a89-1638-4bf9-b2a8-41f96e5220d0","Type":"ContainerStarted","Data":"5f0ecb856e5a9362abc0ab29a45ae02b88f0200bb14a85f35c71fa4fe42963c2"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.403364 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.422573 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" event={"ID":"79983c07-34c7-4539-958e-73168bfa669d","Type":"ContainerStarted","Data":"4dfade063102e67623879e3d41ec73b97af1c052fcb30846633246987f7a2c06"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.430055 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zq7wp" podStartSLOduration=8.430039986 podStartE2EDuration="8.430039986s" podCreationTimestamp="2025-12-06 15:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
15:30:27.42707137 +0000 UTC m=+94.725082283" watchObservedRunningTime="2025-12-06 15:30:27.430039986 +0000 UTC m=+94.728050899" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.430093 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" event={"ID":"c457951c-652c-4d1a-8478-507cfe00cd41","Type":"ContainerStarted","Data":"900cd1bb4d79171ee748844f82b22d645bfa2ca388e196ca413c6e87de84a55d"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.445439 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" event={"ID":"00b1bde8-f54b-4f0a-af8b-3b2e95066b05","Type":"ContainerStarted","Data":"a7cfb7677abd13868bf5862607c2ab2e5997d6a135dd618fea4eb9948bbcd788"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.445504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" event={"ID":"00b1bde8-f54b-4f0a-af8b-3b2e95066b05","Type":"ContainerStarted","Data":"c5725383c5c130d86cf30fabba883862230342251d3b6ff610d3c682f1c06460"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.463284 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" event={"ID":"ab287f93-4a57-497c-a075-007f697c2bb0","Type":"ContainerStarted","Data":"adc0695fa50a45aa3a95283426692bbc89d4e153a84cf1184156e30670fc6c9f"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.463328 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" event={"ID":"ab287f93-4a57-497c-a075-007f697c2bb0","Type":"ContainerStarted","Data":"4ffe6967063269fdeff85cf2760edd122715c01bdad94dca8f05bc5312cc7941"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.463382 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.467075 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" event={"ID":"9f8b2880-f901-410a-a27e-60691016e54e","Type":"ContainerStarted","Data":"85c6b78d39ca7740d563fe4939f94ce75fb9d0748a6e63764dbd4185f4d66aa6"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.475851 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" event={"ID":"0171f176-2943-4c53-b1f5-03bd6fec2a01","Type":"ContainerStarted","Data":"79f117e11590e5fed718a07fa1b06ae0262e442bd28b6e9f21b6db628bcc193c"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.486713 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.488511 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:27.988496053 +0000 UTC m=+95.286506966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.497150 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" event={"ID":"861c6ec8-def2-40b1-8b82-c99e683232ec","Type":"ContainerStarted","Data":"993cc93e87ca4da29dfffe44da81f143cd2da48648a50bd75c9c984697984043"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.524615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" event={"ID":"a9cbe0f4-d9df-4be5-a1b8-4c271ce24649","Type":"ContainerStarted","Data":"467bc40a2ac16a84623e5c1da27dbdc49a03be38b53a916044638ce0b4456220"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.525224 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.549048 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bksdf" podStartSLOduration=75.549035209 podStartE2EDuration="1m15.549035209s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.462833641 +0000 UTC m=+94.760844624" watchObservedRunningTime="2025-12-06 15:30:27.549035209 +0000 UTC m=+94.847046122" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.567898 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" event={"ID":"ee5ab1d8-ba27-41de-a7e6-aed8c51b45af","Type":"ContainerStarted","Data":"099ef1120c015e7545e558206ede900f5b7bf2f5c46a3a1df2b00b5b3679d4f2"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.589155 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.590119 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.090104394 +0000 UTC m=+95.388115307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.611096 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" event={"ID":"4a6e4efa-abe1-44da-9528-73fe113e016a","Type":"ContainerStarted","Data":"aac59491d1166b3bb54b99e9e2b647249920d8900bd6445df06aa87da78b9015"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.620647 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" podStartSLOduration=75.620634205 podStartE2EDuration="1m15.620634205s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.619965685 +0000 UTC m=+94.917976598" watchObservedRunningTime="2025-12-06 15:30:27.620634205 +0000 UTC m=+94.918645118" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.635116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" event={"ID":"91a5aab3-40d9-437c-9332-8099a6a2380b","Type":"ContainerStarted","Data":"6d04042f5ecd933644aae8b2df94a03acd5b5d8f19570fbff61a363224206d42"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.635160 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" 
event={"ID":"91a5aab3-40d9-437c-9332-8099a6a2380b","Type":"ContainerStarted","Data":"0bf11de8195cddabc360070b1c9330fe0d836233ba1d46eb0f12f4b61df8c7e1"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.645065 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:27 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:27 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:27 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.645115 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.666455 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" event={"ID":"239825af-9f47-45a4-933c-e60a3f4f54ad","Type":"ContainerStarted","Data":"94a9275988ba34b7482990fbd8b3236e2f1f822f470dd821403c131c8772617b"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.687123 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" event={"ID":"da17a35e-90ac-4507-bc4f-b901f91051fe","Type":"ContainerStarted","Data":"d34f1c2ff835bb174048528e4240e7a3bea9df7412257f357e748914bf5b0077"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.693482 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.694567 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.194553228 +0000 UTC m=+95.492564141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.728177 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" event={"ID":"acbec6ea-48e4-4f2b-a5fc-070a06599b01","Type":"ContainerStarted","Data":"9bbab0ce3fe30da1c222087651e7431aa2a75fdd0c34988a5c9f57a588ac0564"} Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.730478 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-ftn2g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.730519 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ftn2g" podUID="1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: 
connection refused" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.735274 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5gkxw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.735326 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.738903 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-68b26" podStartSLOduration=75.738891706 podStartE2EDuration="1m15.738891706s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.729240349 +0000 UTC m=+95.027251262" watchObservedRunningTime="2025-12-06 15:30:27.738891706 +0000 UTC m=+95.036902609" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.744721 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.758976 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f25z" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.797152 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.798561 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.298538058 +0000 UTC m=+95.596548971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.881925 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" podStartSLOduration=76.881898673 podStartE2EDuration="1m16.881898673s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.876037553 +0000 UTC m=+95.174048476" watchObservedRunningTime="2025-12-06 15:30:27.881898673 +0000 UTC m=+95.179909586" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.899645 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: 
\"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:27 crc kubenswrapper[4848]: E1206 15:30:27.900072 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.400060527 +0000 UTC m=+95.698071440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.920098 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kptkr" podStartSLOduration=75.920076355 podStartE2EDuration="1m15.920076355s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.910143568 +0000 UTC m=+95.208154481" watchObservedRunningTime="2025-12-06 15:30:27.920076355 +0000 UTC m=+95.218087258" Dec 06 15:30:27 crc kubenswrapper[4848]: I1206 15:30:27.943073 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7h7ps" podStartSLOduration=75.943057917 podStartE2EDuration="1m15.943057917s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.941504582 
+0000 UTC m=+95.239515495" watchObservedRunningTime="2025-12-06 15:30:27.943057917 +0000 UTC m=+95.241068820" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.002395 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.002584 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.502559175 +0000 UTC m=+95.800570088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.002681 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.003143 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.503131141 +0000 UTC m=+95.801142054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.052564 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8dnqm" podStartSLOduration=76.052549306 podStartE2EDuration="1m16.052549306s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:27.978903182 +0000 UTC m=+95.276914095" watchObservedRunningTime="2025-12-06 15:30:28.052549306 +0000 UTC m=+95.350560219" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.053754 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk9hv" podStartSLOduration=76.053748561 podStartE2EDuration="1m16.053748561s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.029584584 +0000 UTC m=+95.327595497" watchObservedRunningTime="2025-12-06 15:30:28.053748561 +0000 UTC m=+95.351759474" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.078767 4848 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndv9m" podStartSLOduration=76.078751822 podStartE2EDuration="1m16.078751822s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.077030053 +0000 UTC m=+95.375040966" watchObservedRunningTime="2025-12-06 15:30:28.078751822 +0000 UTC m=+95.376762725" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.104194 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.104566 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.604551297 +0000 UTC m=+95.902562210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.166096 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f2c5q" podStartSLOduration=76.166080163 podStartE2EDuration="1m16.166080163s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.13101042 +0000 UTC m=+95.429021333" watchObservedRunningTime="2025-12-06 15:30:28.166080163 +0000 UTC m=+95.464091076" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.167591 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" podStartSLOduration=76.167580136 podStartE2EDuration="1m16.167580136s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.164649082 +0000 UTC m=+95.462660005" watchObservedRunningTime="2025-12-06 15:30:28.167580136 +0000 UTC m=+95.465591049" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.191152 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c5lt8" podStartSLOduration=76.191137335 podStartE2EDuration="1m16.191137335s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.190435345 +0000 UTC m=+95.488446258" watchObservedRunningTime="2025-12-06 15:30:28.191137335 +0000 UTC m=+95.489148248" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.206034 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.206404 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.706388645 +0000 UTC m=+96.004399558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.308277 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.308825 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.80879972 +0000 UTC m=+96.106810633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.410237 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.410600 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:28.910584437 +0000 UTC m=+96.208595350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.410669 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7mnj" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.511207 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.511725 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.011711114 +0000 UTC m=+96.309722027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.613098 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.613509 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.113493221 +0000 UTC m=+96.411504124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.638512 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:28 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:28 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:28 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.638827 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.713845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.714047 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 15:30:29.214004732 +0000 UTC m=+96.512015645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.714129 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.714438 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.214426654 +0000 UTC m=+96.512437567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.737358 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" event={"ID":"0171f176-2943-4c53-b1f5-03bd6fec2a01","Type":"ContainerStarted","Data":"97f62023e5b090bce9a25b2ad3f737dab911fb91516cd1fe8cc0ab901f2e210b"} Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.739337 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" event={"ID":"9f8b2880-f901-410a-a27e-60691016e54e","Type":"ContainerStarted","Data":"8d739d4a721ce0dade01a4c3b67b173a6267a3be5b0a8e08558aa40d7a1bedbc"} Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.745448 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.792241 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" podStartSLOduration=77.792218219 podStartE2EDuration="1m17.792218219s" podCreationTimestamp="2025-12-06 15:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:28.784865336 +0000 UTC m=+96.082876249" watchObservedRunningTime="2025-12-06 15:30:28.792218219 +0000 UTC m=+96.090229132" Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.817810 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.819604 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.319578877 +0000 UTC m=+96.617589790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.920911 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:28 crc kubenswrapper[4848]: E1206 15:30:28.921304 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.421290172 +0000 UTC m=+96.719301085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.997317 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:30:28 crc kubenswrapper[4848]: I1206 15:30:28.998233 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.000204 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.021577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:29 crc kubenswrapper[4848]: E1206 15:30:29.021792 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.521773332 +0000 UTC m=+96.819784245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.021846 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.021885 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfpr\" (UniqueName: \"kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.021928 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.021965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: E1206 15:30:29.022257 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.522248185 +0000 UTC m=+96.820259088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.031320 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.099519 4848 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.100382 4848 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-06T15:30:29.099544685Z","Handler":null,"Name":""} Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.123099 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.123281 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: E1206 15:30:29.123303 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.62328535 +0000 UTC m=+96.921296263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.123326 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.123379 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.123411 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfpr\" (UniqueName: \"kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: E1206 15:30:29.123720 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 15:30:29.623686341 +0000 UTC m=+96.921697254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56mp5" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.124192 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.124371 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.147334 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfpr\" (UniqueName: \"kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr\") pod \"community-operators-lgzqt\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.184851 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.186847 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.190598 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.195585 4848 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.195642 4848 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.210097 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.224412 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.224776 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.224827 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tfq5\" (UniqueName: 
\"kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.224963 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.247120 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325423 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tfq5\" (UniqueName: \"kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325551 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325912 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content\") pod \"certified-operators-fgcg8\" (UID: 
\"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.325971 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.330185 4848 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.330211 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.332824 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.351216 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tfq5\" (UniqueName: \"kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5\") pod \"certified-operators-fgcg8\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.374987 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.375949 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.384950 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56mp5\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.386419 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.427045 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpxj\" (UniqueName: \"kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.427140 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.427182 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.489159 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.528828 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpxj\" (UniqueName: \"kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.529125 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.529154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content\") pod 
\"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.529564 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.530134 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.558842 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpxj\" (UniqueName: \"kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj\") pod \"community-operators-n7jzh\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.568270 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.583195 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4prm4"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.587169 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.604044 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4prm4"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.648361 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:29 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:29 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:29 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.648402 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.700143 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.722020 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.738310 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.738347 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9597\" (UniqueName: \"kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.738369 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.801339 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" event={"ID":"9f8b2880-f901-410a-a27e-60691016e54e","Type":"ContainerStarted","Data":"0eb9d44ab92e7b5b5bcbc132ed0b580aafe3a68ff96f8d04d735d70c83a0d1ef"} Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.801405 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" 
event={"ID":"9f8b2880-f901-410a-a27e-60691016e54e","Type":"ContainerStarted","Data":"c1d6d907ae9d07779051fe6ca1064f6131dc4601ec50f8091fc48df95e01457d"} Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.811491 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fr5gw" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.843152 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.843496 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9597\" (UniqueName: \"kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.843556 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.852247 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.852660 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.872575 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9597\" (UniqueName: \"kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597\") pod \"certified-operators-4prm4\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.881191 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w8gt5" podStartSLOduration=10.881174637 podStartE2EDuration="10.881174637s" podCreationTimestamp="2025-12-06 15:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:29.852537291 +0000 UTC m=+97.150548204" watchObservedRunningTime="2025-12-06 15:30:29.881174637 +0000 UTC m=+97.179185550" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.971936 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:30:29 crc kubenswrapper[4848]: I1206 15:30:29.992205 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.136972 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.140420 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.198096 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.240417 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4prm4"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.455280 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.460224 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f6acd83-a70e-4a34-96a5-ea7bd9e95935-metrics-certs\") pod \"network-metrics-daemon-v4dm4\" (UID: \"0f6acd83-a70e-4a34-96a5-ea7bd9e95935\") " pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.637258 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 06 15:30:30 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:30 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:30 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.637343 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.642645 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4dm4" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.827347 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerID="16d268e8a8e4ae1224ad387cdb7056858b1b33d9cd0c95b59d0aff3373f91f04" exitCode=0 Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.827406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerDied","Data":"16d268e8a8e4ae1224ad387cdb7056858b1b33d9cd0c95b59d0aff3373f91f04"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.827473 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerStarted","Data":"e8ae6e8e5b44b11ccea05d5416f0bc90bfe63111dc4ce30c6aa2134de618af0d"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.830480 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.831056 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5528992-4741-41f7-bd58-d7614c936639" 
containerID="8115f8f2caca2dfd06558fd76d395dbd151399c7ffc6417155cda481ce4085df" exitCode=0 Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.831093 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerDied","Data":"8115f8f2caca2dfd06558fd76d395dbd151399c7ffc6417155cda481ce4085df"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.831126 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerStarted","Data":"5926ee71ad6fb3c0862cf0841e363df320fff0bd90518eba1687ee56913763ad"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.833952 4848 generic.go:334] "Generic (PLEG): container finished" podID="c93d43de-aa47-4357-b786-aa586a35d462" containerID="b919732718c6d723a61617006601c8be1d483479c4cee3de766a99408297713f" exitCode=0 Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.834010 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgcg8" event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerDied","Data":"b919732718c6d723a61617006601c8be1d483479c4cee3de766a99408297713f"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.834031 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgcg8" event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerStarted","Data":"acb4de3c99a6a1dc885b8c531f6b21d7c0c30edbb3bb84a454c566c28eb58f07"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.839225 4848 generic.go:334] "Generic (PLEG): container finished" podID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerID="8cd6ee99a2c9aa7dabea46df0fa20b5d59acdafb4fa3555d1e9aa1f42913fcc8" exitCode=0 Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.839294 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerDied","Data":"8cd6ee99a2c9aa7dabea46df0fa20b5d59acdafb4fa3555d1e9aa1f42913fcc8"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.839321 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerStarted","Data":"87c6b6575b1e56cbf4852eeb17e1b3f7c67e2e55207fc5df0d38f9a76096cd50"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.852314 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" event={"ID":"f8e41ef6-75a7-4af2-94b0-14ef0274122a","Type":"ContainerStarted","Data":"8265df00fe4aca945806dea57fbf6818dca40e9e1706848e043daca7d2925df4"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.852361 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" event={"ID":"f8e41ef6-75a7-4af2-94b0-14ef0274122a","Type":"ContainerStarted","Data":"21cc8369ffc620dd2bd1be492d7304fe0d925eb2319ed13dd69f5ead0040fe4a"} Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.874355 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v4dm4"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.906539 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.906514462 podStartE2EDuration="1.906514462s" podCreationTimestamp="2025-12-06 15:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:30.878882234 +0000 UTC m=+98.176893147" watchObservedRunningTime="2025-12-06 15:30:30.906514462 +0000 UTC m=+98.204525385" Dec 06 15:30:30 crc kubenswrapper[4848]: W1206 15:30:30.926092 4848 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6acd83_a70e_4a34_96a5_ea7bd9e95935.slice/crio-0a1e147f08c397d19ed991d00281e5e9c7abf2338f7057d30c964cfa3d188cff WatchSource:0}: Error finding container 0a1e147f08c397d19ed991d00281e5e9c7abf2338f7057d30c964cfa3d188cff: Status 404 returned error can't find the container with id 0a1e147f08c397d19ed991d00281e5e9c7abf2338f7057d30c964cfa3d188cff Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.950674 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" podStartSLOduration=78.950659215 podStartE2EDuration="1m18.950659215s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:30.947812163 +0000 UTC m=+98.245823076" watchObservedRunningTime="2025-12-06 15:30:30.950659215 +0000 UTC m=+98.248670128" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.983160 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.983733 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.985683 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.988263 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.988269 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.998248 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 15:30:30 crc kubenswrapper[4848]: I1206 15:30:30.998876 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.000394 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.001313 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.006196 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.166400 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.166506 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txw96\" 
(UniqueName: \"kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.166527 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.166551 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.166587 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269325 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txw96\" (UniqueName: \"kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269377 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269410 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269478 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.269780 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.271403 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.271391 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.288016 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txw96\" (UniqueName: \"kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96\") pod \"redhat-marketplace-bdjbm\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.289152 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.348010 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.358217 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.371516 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.373120 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.380768 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.461894 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.462312 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.472784 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.472946 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfdq\" (UniqueName: \"kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.472987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.473785 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.552402 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.552456 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.560046 4848 patch_prober.go:28] interesting pod/console-f9d7485db-mdg75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.560198 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mdg75" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.574354 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.574523 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rbfdq\" (UniqueName: \"kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.574548 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.575318 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.577068 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.597890 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfdq\" (UniqueName: \"kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq\") pod \"redhat-marketplace-cl5sq\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.617586 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.634465 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:31 crc kubenswrapper[4848]: W1206 15:30:31.636899 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7029868d_509f_479f_a237_45715e8114e2.slice/crio-66fa250d2c7617d27b44d5f9743ceabdff769d89768e851dc40c15b298a988ff WatchSource:0}: Error finding container 66fa250d2c7617d27b44d5f9743ceabdff769d89768e851dc40c15b298a988ff: Status 404 returned error can't find the container with id 66fa250d2c7617d27b44d5f9743ceabdff769d89768e851dc40c15b298a988ff Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.640515 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:31 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:31 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:31 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.640558 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.654523 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 15:30:31 crc kubenswrapper[4848]: W1206 15:30:31.679335 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod21d34af6_7da7_4f33_a3e9_6aa5008922ef.slice/crio-347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9 WatchSource:0}: Error finding container 347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9: Status 404 returned error can't find the container with id 347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9 Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.706984 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.808164 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.824207 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-ftn2g container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.824223 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-ftn2g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.824257 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ftn2g" podUID="1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.824276 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ftn2g" podUID="1cacda27-3fdb-45e6-8fc0-fa8c7f3cf26a" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.836113 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.836197 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.844181 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.914386 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"21d34af6-7da7-4f33-a3e9-6aa5008922ef","Type":"ContainerStarted","Data":"347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.932789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerStarted","Data":"66fa250d2c7617d27b44d5f9743ceabdff769d89768e851dc40c15b298a988ff"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.939182 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" event={"ID":"0f6acd83-a70e-4a34-96a5-ea7bd9e95935","Type":"ContainerStarted","Data":"0dab9330d00885a007a9bc6c6a3d34deef1c0100903dea3fb76597195ca69732"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.939235 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" 
event={"ID":"0f6acd83-a70e-4a34-96a5-ea7bd9e95935","Type":"ContainerStarted","Data":"9ae25981a79330422e019ab57f69b67e7649a45a09c525f42f17118f59554b20"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.939247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4dm4" event={"ID":"0f6acd83-a70e-4a34-96a5-ea7bd9e95935","Type":"ContainerStarted","Data":"0a1e147f08c397d19ed991d00281e5e9c7abf2338f7057d30c964cfa3d188cff"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.953765 4848 generic.go:334] "Generic (PLEG): container finished" podID="55109807-aa20-4021-8a9f-f40b4c91c2df" containerID="c95ca69fafaffd6980c89a1175c8306242247e0a20d5daa16092ccb41e3fdc8e" exitCode=0 Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.954884 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" event={"ID":"55109807-aa20-4021-8a9f-f40b4c91c2df","Type":"ContainerDied","Data":"c95ca69fafaffd6980c89a1175c8306242247e0a20d5daa16092ccb41e3fdc8e"} Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.957528 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.959926 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v4dm4" podStartSLOduration=79.959911655 podStartE2EDuration="1m19.959911655s" podCreationTimestamp="2025-12-06 15:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:30:31.95556833 +0000 UTC m=+99.253579243" watchObservedRunningTime="2025-12-06 15:30:31.959911655 +0000 UTC m=+99.257922568" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.969794 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-cjw6f" Dec 06 15:30:31 crc kubenswrapper[4848]: I1206 15:30:31.973594 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kn6kp" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.064555 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.190268 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.197138 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.200616 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.200783 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.200916 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.200965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6vn\" (UniqueName: \"kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc 
kubenswrapper[4848]: I1206 15:30:32.200990 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.301836 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.301901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6vn\" (UniqueName: \"kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.301928 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.302382 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.302780 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.328732 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6vn\" (UniqueName: \"kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn\") pod \"redhat-operators-hmm2g\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.570952 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.585566 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.586842 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.606881 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.609716 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.609759 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjkpj\" (UniqueName: \"kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.609778 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.630056 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.630719 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.635032 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.645395 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:32 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:32 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:32 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.645461 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.646075 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.665247 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.714799 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.714857 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.714883 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjkpj\" (UniqueName: \"kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.714901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.715000 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.715537 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.716018 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.747420 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjkpj\" (UniqueName: \"kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj\") pod \"redhat-operators-qxp7g\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.815725 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.815764 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.815824 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.836835 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.915574 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.938098 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.988545 4848 generic.go:334] "Generic (PLEG): container finished" podID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerID="a83cacdfb0666d05e094db1d6786f34d4d5f68255230b46cbd0d31b86e0c7410" exitCode=0 Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.995566 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerDied","Data":"a83cacdfb0666d05e094db1d6786f34d4d5f68255230b46cbd0d31b86e0c7410"} Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.995596 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerStarted","Data":"61fe1fb27429d09d1225cf1e985145e529e2f0a0f5141c9b492268b74cd257f0"} Dec 06 15:30:32 crc kubenswrapper[4848]: I1206 15:30:32.995803 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.003900 4848 generic.go:334] "Generic (PLEG): container finished" podID="7029868d-509f-479f-a237-45715e8114e2" containerID="15a125363743de990e503b8665b93724b7c9158497614a869ff54a9c260489d0" exitCode=0 Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.003990 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerDied","Data":"15a125363743de990e503b8665b93724b7c9158497614a869ff54a9c260489d0"} Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.016896 4848 generic.go:334] "Generic (PLEG): container finished" podID="21d34af6-7da7-4f33-a3e9-6aa5008922ef" containerID="77d2ed3b55743b4e00b87bec81850db65248b690094b179a20ccf7f06ddb8b7b" exitCode=0 Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.016977 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"21d34af6-7da7-4f33-a3e9-6aa5008922ef","Type":"ContainerDied","Data":"77d2ed3b55743b4e00b87bec81850db65248b690094b179a20ccf7f06ddb8b7b"} Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.021051 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerStarted","Data":"e3b76a0ce93fe699afca1077da7a34862443d46875c1b4aeb10bc3777ea36109"} Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.354591 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.448089 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:30:33 crc kubenswrapper[4848]: W1206 15:30:33.453121 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5a426c_df18_43db_b818_0ae2977b7373.slice/crio-29c8506ad8a99d2050c3295be0f8e93dcc55d3a726a0ada7bd62becade521185 WatchSource:0}: Error finding container 29c8506ad8a99d2050c3295be0f8e93dcc55d3a726a0ada7bd62becade521185: Status 404 returned error can't find the container with id 29c8506ad8a99d2050c3295be0f8e93dcc55d3a726a0ada7bd62becade521185 Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.535318 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume\") pod \"55109807-aa20-4021-8a9f-f40b4c91c2df\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.535429 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume\") pod \"55109807-aa20-4021-8a9f-f40b4c91c2df\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.535452 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmlb\" (UniqueName: \"kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb\") pod \"55109807-aa20-4021-8a9f-f40b4c91c2df\" (UID: \"55109807-aa20-4021-8a9f-f40b4c91c2df\") " Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.536758 4848 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume" (OuterVolumeSpecName: "config-volume") pod "55109807-aa20-4021-8a9f-f40b4c91c2df" (UID: "55109807-aa20-4021-8a9f-f40b4c91c2df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.562130 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb" (OuterVolumeSpecName: "kube-api-access-hwmlb") pod "55109807-aa20-4021-8a9f-f40b4c91c2df" (UID: "55109807-aa20-4021-8a9f-f40b4c91c2df"). InnerVolumeSpecName "kube-api-access-hwmlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.564060 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55109807-aa20-4021-8a9f-f40b4c91c2df" (UID: "55109807-aa20-4021-8a9f-f40b4c91c2df"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:30:33 crc kubenswrapper[4848]: W1206 15:30:33.589102 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc71f12a_8807_41aa_8bdf_870a9494cd10.slice/crio-c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35 WatchSource:0}: Error finding container c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35: Status 404 returned error can't find the container with id c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35 Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.595449 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.636568 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55109807-aa20-4021-8a9f-f40b4c91c2df-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.636924 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwmlb\" (UniqueName: \"kubernetes.io/projected/55109807-aa20-4021-8a9f-f40b4c91c2df-kube-api-access-hwmlb\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.636941 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55109807-aa20-4021-8a9f-f40b4c91c2df-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.642028 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:33 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:33 crc kubenswrapper[4848]: [+]process-running ok Dec 
06 15:30:33 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:33 crc kubenswrapper[4848]: I1206 15:30:33.642101 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.056955 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc71f12a-8807-41aa-8bdf-870a9494cd10","Type":"ContainerStarted","Data":"c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35"} Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.061105 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" event={"ID":"55109807-aa20-4021-8a9f-f40b4c91c2df","Type":"ContainerDied","Data":"1e329708038c6e88a199ebedb218c89acee49d960b9b3ad79aa8d9ceae139cfb"} Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.061133 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417250-tqwpk" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.061144 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e329708038c6e88a199ebedb218c89acee49d960b9b3ad79aa8d9ceae139cfb" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.089905 4848 generic.go:334] "Generic (PLEG): container finished" podID="2913f1de-4117-470b-b1de-a876051a131c" containerID="01921c4e8b70059923fb7ad88000a9a5aa060169049d0fbe1788beec84574e4d" exitCode=0 Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.090033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerDied","Data":"01921c4e8b70059923fb7ad88000a9a5aa060169049d0fbe1788beec84574e4d"} Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.103090 4848 generic.go:334] "Generic (PLEG): container finished" podID="bc5a426c-df18-43db-b818-0ae2977b7373" containerID="7a8b7c52023c21739ead434bb8b7bc43cda985188c0acc74c7ae53e005aebba4" exitCode=0 Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.104150 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerDied","Data":"7a8b7c52023c21739ead434bb8b7bc43cda985188c0acc74c7ae53e005aebba4"} Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.104177 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerStarted","Data":"29c8506ad8a99d2050c3295be0f8e93dcc55d3a726a0ada7bd62becade521185"} Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.340211 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.448506 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir\") pod \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.448572 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access\") pod \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\" (UID: \"21d34af6-7da7-4f33-a3e9-6aa5008922ef\") " Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.449159 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21d34af6-7da7-4f33-a3e9-6aa5008922ef" (UID: "21d34af6-7da7-4f33-a3e9-6aa5008922ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.455393 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21d34af6-7da7-4f33-a3e9-6aa5008922ef" (UID: "21d34af6-7da7-4f33-a3e9-6aa5008922ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.550218 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.550249 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d34af6-7da7-4f33-a3e9-6aa5008922ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.636919 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:34 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:34 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:34 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:34 crc kubenswrapper[4848]: I1206 15:30:34.636976 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.137501 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"21d34af6-7da7-4f33-a3e9-6aa5008922ef","Type":"ContainerDied","Data":"347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9"} Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.137561 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347abca5af5214dd49a062fff84d0087c34e3db9b7b631ecdf71ae82d90d4ca9" Dec 06 15:30:35 crc 
kubenswrapper[4848]: I1206 15:30:35.137526 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.140209 4848 generic.go:334] "Generic (PLEG): container finished" podID="cc71f12a-8807-41aa-8bdf-870a9494cd10" containerID="c042f05fd195091d8f7458c2c41c989a0896e4c15fd689ecd14e33fc519f7dc5" exitCode=0 Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.140249 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc71f12a-8807-41aa-8bdf-870a9494cd10","Type":"ContainerDied","Data":"c042f05fd195091d8f7458c2c41c989a0896e4c15fd689ecd14e33fc519f7dc5"} Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.635665 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:35 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:35 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:35 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:35 crc kubenswrapper[4848]: I1206 15:30:35.635748 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.458292 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.587168 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access\") pod \"cc71f12a-8807-41aa-8bdf-870a9494cd10\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.587262 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir\") pod \"cc71f12a-8807-41aa-8bdf-870a9494cd10\" (UID: \"cc71f12a-8807-41aa-8bdf-870a9494cd10\") " Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.587394 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc71f12a-8807-41aa-8bdf-870a9494cd10" (UID: "cc71f12a-8807-41aa-8bdf-870a9494cd10"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.587752 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc71f12a-8807-41aa-8bdf-870a9494cd10-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.607120 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc71f12a-8807-41aa-8bdf-870a9494cd10" (UID: "cc71f12a-8807-41aa-8bdf-870a9494cd10"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.636017 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:36 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:36 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:36 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.636136 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:36 crc kubenswrapper[4848]: I1206 15:30:36.689014 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc71f12a-8807-41aa-8bdf-870a9494cd10-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.179167 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.179004 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc71f12a-8807-41aa-8bdf-870a9494cd10","Type":"ContainerDied","Data":"c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35"} Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.180467 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c231aafbfe8a0f34b680235c0c971783169e75d51b8dbc39cf2bbb5966b8be35" Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.635842 4848 patch_prober.go:28] interesting pod/router-default-5444994796-szkmh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 15:30:37 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Dec 06 15:30:37 crc kubenswrapper[4848]: [+]process-running ok Dec 06 15:30:37 crc kubenswrapper[4848]: healthz check failed Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.635917 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-szkmh" podUID="ca40b01e-331c-4d2e-908a-f25b7b7b40e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 15:30:37 crc kubenswrapper[4848]: I1206 15:30:37.789416 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zq7wp" Dec 06 15:30:38 crc kubenswrapper[4848]: I1206 15:30:38.635832 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:38 crc kubenswrapper[4848]: I1206 15:30:38.638544 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-5444994796-szkmh" Dec 06 15:30:41 crc kubenswrapper[4848]: I1206 15:30:41.552239 4848 patch_prober.go:28] interesting pod/console-f9d7485db-mdg75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Dec 06 15:30:41 crc kubenswrapper[4848]: I1206 15:30:41.552299 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mdg75" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Dec 06 15:30:41 crc kubenswrapper[4848]: I1206 15:30:41.837923 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ftn2g" Dec 06 15:30:49 crc kubenswrapper[4848]: I1206 15:30:49.495520 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:30:51 crc kubenswrapper[4848]: I1206 15:30:51.556278 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:30:51 crc kubenswrapper[4848]: I1206 15:30:51.560719 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:31:02 crc kubenswrapper[4848]: I1206 15:31:02.690628 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5fpmg" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.223907 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 15:31:08 crc kubenswrapper[4848]: E1206 15:31:08.224630 4848 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="55109807-aa20-4021-8a9f-f40b4c91c2df" containerName="collect-profiles" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224643 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="55109807-aa20-4021-8a9f-f40b4c91c2df" containerName="collect-profiles" Dec 06 15:31:08 crc kubenswrapper[4848]: E1206 15:31:08.224659 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d34af6-7da7-4f33-a3e9-6aa5008922ef" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224665 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d34af6-7da7-4f33-a3e9-6aa5008922ef" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: E1206 15:31:08.224677 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc71f12a-8807-41aa-8bdf-870a9494cd10" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224684 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc71f12a-8807-41aa-8bdf-870a9494cd10" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224812 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d34af6-7da7-4f33-a3e9-6aa5008922ef" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224827 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="55109807-aa20-4021-8a9f-f40b4c91c2df" containerName="collect-profiles" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.224838 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc71f12a-8807-41aa-8bdf-870a9494cd10" containerName="pruner" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.225231 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.227815 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.227857 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.229672 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.254752 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.254902 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.355995 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.356075 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.356165 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.375468 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:08 crc kubenswrapper[4848]: I1206 15:31:08.562406 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:09 crc kubenswrapper[4848]: E1206 15:31:09.820594 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 15:31:09 crc kubenswrapper[4848]: E1206 15:31:09.820759 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vxpxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-n7jzh_openshift-marketplace(b5528992-4741-41f7-bd58-d7614c936639): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:09 crc kubenswrapper[4848]: E1206 15:31:09.821909 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-n7jzh" podUID="b5528992-4741-41f7-bd58-d7614c936639" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.287546 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-n7jzh" podUID="b5528992-4741-41f7-bd58-d7614c936639" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.350445 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.350572 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txw96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bdjbm_openshift-marketplace(7029868d-509f-479f-a237-45715e8114e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.351964 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bdjbm" podUID="7029868d-509f-479f-a237-45715e8114e2" Dec 06 15:31:12 crc 
kubenswrapper[4848]: E1206 15:31:12.356239 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.356317 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbfdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-cl5sq_openshift-marketplace(1f66c042-6bdd-4d2d-bd20-09521140274e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:12 crc kubenswrapper[4848]: E1206 15:31:12.357797 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cl5sq" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.813789 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.814459 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.821562 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.833115 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.833195 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 
15:31:13.833311 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.934649 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.934708 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.934805 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.934866 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.934955 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:13 crc kubenswrapper[4848]: I1206 15:31:13.956476 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access\") pod \"installer-9-crc\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:14 crc kubenswrapper[4848]: I1206 15:31:14.147431 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.031192 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bdjbm" podUID="7029868d-509f-479f-a237-45715e8114e2" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.031205 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cl5sq" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.110095 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.110243 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-znfpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lgzqt_openshift-marketplace(9aba59c4-d9e7-444b-9620-29a26fa4c9fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.111406 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lgzqt" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.115750 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.115869 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjkpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qxp7g_openshift-marketplace(bc5a426c-df18-43db-b818-0ae2977b7373): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.116923 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qxp7g" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.147387 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.147510 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bc6vn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hmm2g_openshift-marketplace(2913f1de-4117-470b-b1de-a876051a131c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:15 crc kubenswrapper[4848]: E1206 15:31:15.149235 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hmm2g" podUID="2913f1de-4117-470b-b1de-a876051a131c" Dec 06 15:31:16 crc 
kubenswrapper[4848]: E1206 15:31:16.479055 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lgzqt" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.479079 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qxp7g" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.479195 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hmm2g" podUID="2913f1de-4117-470b-b1de-a876051a131c" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.564411 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.564816 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tfq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fgcg8_openshift-marketplace(c93d43de-aa47-4357-b786-aa586a35d462): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.566009 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fgcg8" podUID="c93d43de-aa47-4357-b786-aa586a35d462" Dec 06 15:31:16 crc 
kubenswrapper[4848]: E1206 15:31:16.573249 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.573384 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9597,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-4prm4_openshift-marketplace(a3734c39-129f-4b25-be9d-e2ca36e98de3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 15:31:16 crc kubenswrapper[4848]: E1206 15:31:16.574674 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4prm4" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" Dec 06 15:31:16 crc kubenswrapper[4848]: I1206 15:31:16.885275 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 15:31:16 crc kubenswrapper[4848]: W1206 15:31:16.942761 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c531794_18cc_4c23_b373_d6628e2363ff.slice/crio-83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1 WatchSource:0}: Error finding container 83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1: Status 404 returned error can't find the container with id 83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1 Dec 06 15:31:16 crc kubenswrapper[4848]: I1206 15:31:16.968846 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.150639 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.151037 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.421493 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c531794-18cc-4c23-b373-d6628e2363ff","Type":"ContainerStarted","Data":"87fbbf1252cb8a6886e811bb425f50dc2c3ca27b13f95d9e81d593e8bc08dbf6"} Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.421674 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c531794-18cc-4c23-b373-d6628e2363ff","Type":"ContainerStarted","Data":"83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1"} Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.423813 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"88e145b9-64b0-44cc-8eb1-ce5d48791f20","Type":"ContainerStarted","Data":"220a4573edc1d3b83b24665ea7ce557af87bc05db4106018e0c60024bb98dc48"} Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.423883 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"88e145b9-64b0-44cc-8eb1-ce5d48791f20","Type":"ContainerStarted","Data":"c3289b7b8b9bac2b471a06d7137da44a0fed6216740b29be72fec57a12b99182"} Dec 06 15:31:17 crc kubenswrapper[4848]: E1206 15:31:17.426099 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4prm4" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" Dec 06 15:31:17 crc kubenswrapper[4848]: E1206 
15:31:17.426099 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fgcg8" podUID="c93d43de-aa47-4357-b786-aa586a35d462" Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.438154 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=9.43813625 podStartE2EDuration="9.43813625s" podCreationTimestamp="2025-12-06 15:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:31:17.432285525 +0000 UTC m=+144.730296438" watchObservedRunningTime="2025-12-06 15:31:17.43813625 +0000 UTC m=+144.736147163" Dec 06 15:31:17 crc kubenswrapper[4848]: I1206 15:31:17.457740 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.457724948 podStartE2EDuration="4.457724948s" podCreationTimestamp="2025-12-06 15:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:31:17.454509713 +0000 UTC m=+144.752520626" watchObservedRunningTime="2025-12-06 15:31:17.457724948 +0000 UTC m=+144.755735861" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.430992 4848 generic.go:334] "Generic (PLEG): container finished" podID="4c531794-18cc-4c23-b373-d6628e2363ff" containerID="87fbbf1252cb8a6886e811bb425f50dc2c3ca27b13f95d9e81d593e8bc08dbf6" exitCode=0 Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.431079 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"4c531794-18cc-4c23-b373-d6628e2363ff","Type":"ContainerDied","Data":"87fbbf1252cb8a6886e811bb425f50dc2c3ca27b13f95d9e81d593e8bc08dbf6"} Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.895477 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.895549 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.897352 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.897547 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.907419 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.912978 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.996357 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.996639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:31:18 crc kubenswrapper[4848]: I1206 15:31:18.998485 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.008477 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.020778 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 
15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.021816 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.209688 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.240283 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.251316 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:31:19 crc kubenswrapper[4848]: W1206 15:31:19.528200 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-7e6dacfea0f4ede47d762f2e32becd6b475a5f5d590758e4fad6835e69fe5cd6 WatchSource:0}: Error finding container 7e6dacfea0f4ede47d762f2e32becd6b475a5f5d590758e4fad6835e69fe5cd6: Status 404 returned error can't find the container with id 7e6dacfea0f4ede47d762f2e32becd6b475a5f5d590758e4fad6835e69fe5cd6 Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.642634 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:19 crc kubenswrapper[4848]: W1206 15:31:19.807403 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-bdf92bd4dca7af16d26dccd1f2a40b61f7d6271110fb5f45ddb7f93761379530 WatchSource:0}: Error finding container bdf92bd4dca7af16d26dccd1f2a40b61f7d6271110fb5f45ddb7f93761379530: Status 404 returned error can't find the container with id bdf92bd4dca7af16d26dccd1f2a40b61f7d6271110fb5f45ddb7f93761379530 Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.808379 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access\") pod \"4c531794-18cc-4c23-b373-d6628e2363ff\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.808454 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir\") pod \"4c531794-18cc-4c23-b373-d6628e2363ff\" (UID: \"4c531794-18cc-4c23-b373-d6628e2363ff\") " Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.808729 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4c531794-18cc-4c23-b373-d6628e2363ff" (UID: "4c531794-18cc-4c23-b373-d6628e2363ff"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.812771 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4c531794-18cc-4c23-b373-d6628e2363ff" (UID: "4c531794-18cc-4c23-b373-d6628e2363ff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.910351 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c531794-18cc-4c23-b373-d6628e2363ff-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:19 crc kubenswrapper[4848]: I1206 15:31:19.910637 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c531794-18cc-4c23-b373-d6628e2363ff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.445467 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b948382b8dc1c84fbf063c467c2fb8d9a0d1def0f78d070cbf5cc12422847929"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.445754 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ed94acdf682c8ffa3ed2e9a0072303fc68248e136a72666c0fef18ee78d48ce"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.447081 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3e7492b869f3b00d1bc8fe22b094ce30f18be54ef3cd7ef2a13381cf5cbdf00c"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.447104 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bdf92bd4dca7af16d26dccd1f2a40b61f7d6271110fb5f45ddb7f93761379530"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.447300 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.448797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"00e823ca0cbc2849a50b4a884e7a7042413259afc56d16fb238c26dbb0f67f73"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.448848 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7e6dacfea0f4ede47d762f2e32becd6b475a5f5d590758e4fad6835e69fe5cd6"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.450908 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c531794-18cc-4c23-b373-d6628e2363ff","Type":"ContainerDied","Data":"83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1"} Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.451016 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b30aaacbce3cb8ad97ed212917b4848420cdb42467d02a7bda087116996dd1" Dec 06 15:31:20 crc kubenswrapper[4848]: I1206 15:31:20.450952 4848 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 15:31:24 crc kubenswrapper[4848]: I1206 15:31:24.472275 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerStarted","Data":"25f770acbca745f3a7799825f9d0baddab03c1d0c139d7ff75d02facdc94a4de"} Dec 06 15:31:25 crc kubenswrapper[4848]: I1206 15:31:25.488059 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5528992-4741-41f7-bd58-d7614c936639" containerID="25f770acbca745f3a7799825f9d0baddab03c1d0c139d7ff75d02facdc94a4de" exitCode=0 Dec 06 15:31:25 crc kubenswrapper[4848]: I1206 15:31:25.488136 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerDied","Data":"25f770acbca745f3a7799825f9d0baddab03c1d0c139d7ff75d02facdc94a4de"} Dec 06 15:31:26 crc kubenswrapper[4848]: I1206 15:31:26.494498 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerStarted","Data":"fc2ffe5b0e907267c47a9f0974d70a07c5b8313088874bdec55e3864c3f69132"} Dec 06 15:31:26 crc kubenswrapper[4848]: I1206 15:31:26.516150 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7jzh" podStartSLOduration=2.381503773 podStartE2EDuration="57.516130496s" podCreationTimestamp="2025-12-06 15:30:29 +0000 UTC" firstStartedPulling="2025-12-06 15:30:30.832753093 +0000 UTC m=+98.130763996" lastFinishedPulling="2025-12-06 15:31:25.967379806 +0000 UTC m=+153.265390719" observedRunningTime="2025-12-06 15:31:26.512443758 +0000 UTC m=+153.810454671" watchObservedRunningTime="2025-12-06 15:31:26.516130496 +0000 UTC m=+153.814141399" Dec 06 15:31:27 crc 
kubenswrapper[4848]: I1206 15:31:27.502534 4848 generic.go:334] "Generic (PLEG): container finished" podID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerID="a3b56688841b7e2e00b7cf076ea4e0ac35c6408bb0b93cb6a80bc3576d87c76c" exitCode=0 Dec 06 15:31:27 crc kubenswrapper[4848]: I1206 15:31:27.502635 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerDied","Data":"a3b56688841b7e2e00b7cf076ea4e0ac35c6408bb0b93cb6a80bc3576d87c76c"} Dec 06 15:31:28 crc kubenswrapper[4848]: I1206 15:31:28.508787 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerStarted","Data":"2059271a7caa2e5ea0f73d8cc7b8e433922b08b9fc53fc2a80887b5e809d92d9"} Dec 06 15:31:28 crc kubenswrapper[4848]: I1206 15:31:28.533676 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cl5sq" podStartSLOduration=2.655745966 podStartE2EDuration="57.533657878s" podCreationTimestamp="2025-12-06 15:30:31 +0000 UTC" firstStartedPulling="2025-12-06 15:30:33.012350981 +0000 UTC m=+100.310361894" lastFinishedPulling="2025-12-06 15:31:27.890262893 +0000 UTC m=+155.188273806" observedRunningTime="2025-12-06 15:31:28.528960553 +0000 UTC m=+155.826971466" watchObservedRunningTime="2025-12-06 15:31:28.533657878 +0000 UTC m=+155.831668791" Dec 06 15:31:29 crc kubenswrapper[4848]: I1206 15:31:29.516588 4848 generic.go:334] "Generic (PLEG): container finished" podID="bc5a426c-df18-43db-b818-0ae2977b7373" containerID="06bfd66e6fc31f8ae4bd4779b672b6ae1f38db168447b52487d8a965d2c43fd5" exitCode=0 Dec 06 15:31:29 crc kubenswrapper[4848]: I1206 15:31:29.516664 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" 
event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerDied","Data":"06bfd66e6fc31f8ae4bd4779b672b6ae1f38db168447b52487d8a965d2c43fd5"} Dec 06 15:31:29 crc kubenswrapper[4848]: I1206 15:31:29.701157 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:29 crc kubenswrapper[4848]: I1206 15:31:29.701209 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:29 crc kubenswrapper[4848]: I1206 15:31:29.780203 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.525920 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerStarted","Data":"5d256014b2660f6f29495f68e098a9cdc3c684e72afbb5544b45ed2c18e3f02f"} Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.529179 4848 generic.go:334] "Generic (PLEG): container finished" podID="c93d43de-aa47-4357-b786-aa586a35d462" containerID="1e410c4c10617d38176c56ecf0bd3a2d9c329c6a750d932eb5eb157c1fabcdd0" exitCode=0 Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.529261 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgcg8" event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerDied","Data":"1e410c4c10617d38176c56ecf0bd3a2d9c329c6a750d932eb5eb157c1fabcdd0"} Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.534669 4848 generic.go:334] "Generic (PLEG): container finished" podID="7029868d-509f-479f-a237-45715e8114e2" containerID="3a5ef5830117b70533447764d61b0371883b101b3eacfc5d9dee316d3f859df6" exitCode=0 Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.534759 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerDied","Data":"3a5ef5830117b70533447764d61b0371883b101b3eacfc5d9dee316d3f859df6"} Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.537500 4848 generic.go:334] "Generic (PLEG): container finished" podID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerID="8b410cb77833909364fb2f514e54a5b2582773a7defd5698288742f6b8eadc05" exitCode=0 Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.537994 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerDied","Data":"8b410cb77833909364fb2f514e54a5b2582773a7defd5698288742f6b8eadc05"} Dec 06 15:31:30 crc kubenswrapper[4848]: I1206 15:31:30.569985 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxp7g" podStartSLOduration=2.619027607 podStartE2EDuration="58.569970616s" podCreationTimestamp="2025-12-06 15:30:32 +0000 UTC" firstStartedPulling="2025-12-06 15:30:34.108450947 +0000 UTC m=+101.406461860" lastFinishedPulling="2025-12-06 15:31:30.059393956 +0000 UTC m=+157.357404869" observedRunningTime="2025-12-06 15:31:30.552458213 +0000 UTC m=+157.850469146" watchObservedRunningTime="2025-12-06 15:31:30.569970616 +0000 UTC m=+157.867981519" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.544265 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerStarted","Data":"39c0cf19568dd424cbba8ba20145e648cc341c996340c8e82536a493de414236"} Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.546393 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgcg8" 
event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerStarted","Data":"1c733c7a7c28a715d0e3df0cfdcf9d42aff38981a19129f6d63437f58e201132"} Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.548339 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerStarted","Data":"c770863ec9677c862486bafb90a6d1fe6c0e3220967a74539a731c86abdb6c1f"} Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.582931 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fgcg8" podStartSLOduration=2.4313612989999998 podStartE2EDuration="1m2.582917028s" podCreationTimestamp="2025-12-06 15:30:29 +0000 UTC" firstStartedPulling="2025-12-06 15:30:30.835663537 +0000 UTC m=+98.133674450" lastFinishedPulling="2025-12-06 15:31:30.987219256 +0000 UTC m=+158.285230179" observedRunningTime="2025-12-06 15:31:31.579102527 +0000 UTC m=+158.877113440" watchObservedRunningTime="2025-12-06 15:31:31.582917028 +0000 UTC m=+158.880927941" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.583532 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lgzqt" podStartSLOduration=3.4350682089999998 podStartE2EDuration="1m3.583526914s" podCreationTimestamp="2025-12-06 15:30:28 +0000 UTC" firstStartedPulling="2025-12-06 15:30:30.844918015 +0000 UTC m=+98.142928918" lastFinishedPulling="2025-12-06 15:31:30.99337669 +0000 UTC m=+158.291387623" observedRunningTime="2025-12-06 15:31:31.561043309 +0000 UTC m=+158.859054222" watchObservedRunningTime="2025-12-06 15:31:31.583526914 +0000 UTC m=+158.881537827" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.616274 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdjbm" podStartSLOduration=3.599375854 
podStartE2EDuration="1m1.616239749s" podCreationTimestamp="2025-12-06 15:30:30 +0000 UTC" firstStartedPulling="2025-12-06 15:30:33.012865836 +0000 UTC m=+100.310876749" lastFinishedPulling="2025-12-06 15:31:31.029729741 +0000 UTC m=+158.327740644" observedRunningTime="2025-12-06 15:31:31.61361108 +0000 UTC m=+158.911621993" watchObservedRunningTime="2025-12-06 15:31:31.616239749 +0000 UTC m=+158.914250662" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.708253 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.708303 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:31 crc kubenswrapper[4848]: I1206 15:31:31.744331 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:32 crc kubenswrapper[4848]: I1206 15:31:32.555745 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerStarted","Data":"58844abeee291e13fc7a6a929a8f384a223ab709325de815c5d9112958469a72"} Dec 06 15:31:32 crc kubenswrapper[4848]: I1206 15:31:32.558528 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerID="55b90ab8fe210c7d33bcf88cf98df2033048e3df8c6d0fe700980af1ea0e6529" exitCode=0 Dec 06 15:31:32 crc kubenswrapper[4848]: I1206 15:31:32.559184 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerDied","Data":"55b90ab8fe210c7d33bcf88cf98df2033048e3df8c6d0fe700980af1ea0e6529"} Dec 06 15:31:32 crc kubenswrapper[4848]: I1206 15:31:32.938849 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:32 crc kubenswrapper[4848]: I1206 15:31:32.938904 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:33 crc kubenswrapper[4848]: I1206 15:31:33.978098 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxp7g" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="registry-server" probeResult="failure" output=< Dec 06 15:31:33 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Dec 06 15:31:33 crc kubenswrapper[4848]: > Dec 06 15:31:34 crc kubenswrapper[4848]: I1206 15:31:34.568429 4848 generic.go:334] "Generic (PLEG): container finished" podID="2913f1de-4117-470b-b1de-a876051a131c" containerID="58844abeee291e13fc7a6a929a8f384a223ab709325de815c5d9112958469a72" exitCode=0 Dec 06 15:31:34 crc kubenswrapper[4848]: I1206 15:31:34.568479 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerDied","Data":"58844abeee291e13fc7a6a929a8f384a223ab709325de815c5d9112958469a72"} Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.333359 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.333930 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.403355 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.569911 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.570132 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.618258 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.642602 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.715286 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:31:39 crc kubenswrapper[4848]: I1206 15:31:39.751831 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:40 crc kubenswrapper[4848]: I1206 15:31:40.603847 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerStarted","Data":"547698fb856285f38fa2f13bab647a1a199611d0eec7e5e56ee21a5dbe468fe7"} Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.348927 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.348990 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.416111 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.446783 4848 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.447124 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n7jzh" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="registry-server" containerID="cri-o://fc2ffe5b0e907267c47a9f0974d70a07c5b8313088874bdec55e3864c3f69132" gracePeriod=2 Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.635982 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4prm4" podStartSLOduration=3.7333711469999997 podStartE2EDuration="1m12.635937672s" podCreationTimestamp="2025-12-06 15:30:29 +0000 UTC" firstStartedPulling="2025-12-06 15:30:30.830234581 +0000 UTC m=+98.128245494" lastFinishedPulling="2025-12-06 15:31:39.732801066 +0000 UTC m=+167.030812019" observedRunningTime="2025-12-06 15:31:41.625575007 +0000 UTC m=+168.923585920" watchObservedRunningTime="2025-12-06 15:31:41.635937672 +0000 UTC m=+168.933948585" Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.665242 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:31:41 crc kubenswrapper[4848]: I1206 15:31:41.749985 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:42 crc kubenswrapper[4848]: I1206 15:31:42.615981 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerStarted","Data":"32d8f945220f7c927c972df3dc87b0e9cc2b757235e3a12596ef4a919599e594"} Dec 06 15:31:42 crc kubenswrapper[4848]: I1206 15:31:42.618729 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5528992-4741-41f7-bd58-d7614c936639" 
containerID="fc2ffe5b0e907267c47a9f0974d70a07c5b8313088874bdec55e3864c3f69132" exitCode=0 Dec 06 15:31:42 crc kubenswrapper[4848]: I1206 15:31:42.618745 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerDied","Data":"fc2ffe5b0e907267c47a9f0974d70a07c5b8313088874bdec55e3864c3f69132"} Dec 06 15:31:42 crc kubenswrapper[4848]: I1206 15:31:42.983354 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:43 crc kubenswrapper[4848]: I1206 15:31:43.021643 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:43 crc kubenswrapper[4848]: I1206 15:31:43.839329 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:31:43 crc kubenswrapper[4848]: I1206 15:31:43.839591 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cl5sq" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="registry-server" containerID="cri-o://2059271a7caa2e5ea0f73d8cc7b8e433922b08b9fc53fc2a80887b5e809d92d9" gracePeriod=2 Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.521881 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.639780 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxpxj\" (UniqueName: \"kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj\") pod \"b5528992-4741-41f7-bd58-d7614c936639\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.639861 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities\") pod \"b5528992-4741-41f7-bd58-d7614c936639\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.639893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content\") pod \"b5528992-4741-41f7-bd58-d7614c936639\" (UID: \"b5528992-4741-41f7-bd58-d7614c936639\") " Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.641525 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities" (OuterVolumeSpecName: "utilities") pod "b5528992-4741-41f7-bd58-d7614c936639" (UID: "b5528992-4741-41f7-bd58-d7614c936639"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.644011 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7jzh" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.643922 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7jzh" event={"ID":"b5528992-4741-41f7-bd58-d7614c936639","Type":"ContainerDied","Data":"5926ee71ad6fb3c0862cf0841e363df320fff0bd90518eba1687ee56913763ad"} Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.644497 4848 scope.go:117] "RemoveContainer" containerID="fc2ffe5b0e907267c47a9f0974d70a07c5b8313088874bdec55e3864c3f69132" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.657935 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj" (OuterVolumeSpecName: "kube-api-access-vxpxj") pod "b5528992-4741-41f7-bd58-d7614c936639" (UID: "b5528992-4741-41f7-bd58-d7614c936639"). InnerVolumeSpecName "kube-api-access-vxpxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.676541 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmm2g" podStartSLOduration=6.200446357 podStartE2EDuration="1m13.676516141s" podCreationTimestamp="2025-12-06 15:30:32 +0000 UTC" firstStartedPulling="2025-12-06 15:30:34.091772176 +0000 UTC m=+101.389783089" lastFinishedPulling="2025-12-06 15:31:41.56784196 +0000 UTC m=+168.865852873" observedRunningTime="2025-12-06 15:31:45.670964244 +0000 UTC m=+172.968975157" watchObservedRunningTime="2025-12-06 15:31:45.676516141 +0000 UTC m=+172.974527084" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.683315 4848 scope.go:117] "RemoveContainer" containerID="25f770acbca745f3a7799825f9d0baddab03c1d0c139d7ff75d02facdc94a4de" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.700935 4848 scope.go:117] "RemoveContainer" 
containerID="8115f8f2caca2dfd06558fd76d395dbd151399c7ffc6417155cda481ce4085df" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.709999 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5528992-4741-41f7-bd58-d7614c936639" (UID: "b5528992-4741-41f7-bd58-d7614c936639"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.740672 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxpxj\" (UniqueName: \"kubernetes.io/projected/b5528992-4741-41f7-bd58-d7614c936639-kube-api-access-vxpxj\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.740789 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.740800 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5528992-4741-41f7-bd58-d7614c936639-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.971084 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:31:45 crc kubenswrapper[4848]: I1206 15:31:45.974987 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7jzh"] Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.440463 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.440864 4848 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-qxp7g" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="registry-server" containerID="cri-o://5d256014b2660f6f29495f68e098a9cdc3c684e72afbb5544b45ed2c18e3f02f" gracePeriod=2 Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.651192 4848 generic.go:334] "Generic (PLEG): container finished" podID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerID="2059271a7caa2e5ea0f73d8cc7b8e433922b08b9fc53fc2a80887b5e809d92d9" exitCode=0 Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.651561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerDied","Data":"2059271a7caa2e5ea0f73d8cc7b8e433922b08b9fc53fc2a80887b5e809d92d9"} Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.742800 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.859860 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbfdq\" (UniqueName: \"kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq\") pod \"1f66c042-6bdd-4d2d-bd20-09521140274e\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.859916 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content\") pod \"1f66c042-6bdd-4d2d-bd20-09521140274e\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.859951 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities\") pod 
\"1f66c042-6bdd-4d2d-bd20-09521140274e\" (UID: \"1f66c042-6bdd-4d2d-bd20-09521140274e\") " Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.860788 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities" (OuterVolumeSpecName: "utilities") pod "1f66c042-6bdd-4d2d-bd20-09521140274e" (UID: "1f66c042-6bdd-4d2d-bd20-09521140274e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.865975 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq" (OuterVolumeSpecName: "kube-api-access-rbfdq") pod "1f66c042-6bdd-4d2d-bd20-09521140274e" (UID: "1f66c042-6bdd-4d2d-bd20-09521140274e"). InnerVolumeSpecName "kube-api-access-rbfdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.878765 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f66c042-6bdd-4d2d-bd20-09521140274e" (UID: "1f66c042-6bdd-4d2d-bd20-09521140274e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.961478 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbfdq\" (UniqueName: \"kubernetes.io/projected/1f66c042-6bdd-4d2d-bd20-09521140274e-kube-api-access-rbfdq\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.961532 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.961546 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f66c042-6bdd-4d2d-bd20-09521140274e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:46 crc kubenswrapper[4848]: I1206 15:31:46.975713 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5528992-4741-41f7-bd58-d7614c936639" path="/var/lib/kubelet/pods/b5528992-4741-41f7-bd58-d7614c936639/volumes" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.150111 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.150184 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.660551 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="bc5a426c-df18-43db-b818-0ae2977b7373" containerID="5d256014b2660f6f29495f68e098a9cdc3c684e72afbb5544b45ed2c18e3f02f" exitCode=0 Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.660615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerDied","Data":"5d256014b2660f6f29495f68e098a9cdc3c684e72afbb5544b45ed2c18e3f02f"} Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.662606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cl5sq" event={"ID":"1f66c042-6bdd-4d2d-bd20-09521140274e","Type":"ContainerDied","Data":"61fe1fb27429d09d1225cf1e985145e529e2f0a0f5141c9b492268b74cd257f0"} Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.662639 4848 scope.go:117] "RemoveContainer" containerID="2059271a7caa2e5ea0f73d8cc7b8e433922b08b9fc53fc2a80887b5e809d92d9" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.662729 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cl5sq" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.673640 4848 scope.go:117] "RemoveContainer" containerID="a3b56688841b7e2e00b7cf076ea4e0ac35c6408bb0b93cb6a80bc3576d87c76c" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.676876 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.688957 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cl5sq"] Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.694600 4848 scope.go:117] "RemoveContainer" containerID="a83cacdfb0666d05e094db1d6786f34d4d5f68255230b46cbd0d31b86e0c7410" Dec 06 15:31:47 crc kubenswrapper[4848]: I1206 15:31:47.851134 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4nmrw"] Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.212176 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.291610 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content\") pod \"bc5a426c-df18-43db-b818-0ae2977b7373\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.291665 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities\") pod \"bc5a426c-df18-43db-b818-0ae2977b7373\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.291781 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjkpj\" (UniqueName: \"kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj\") pod \"bc5a426c-df18-43db-b818-0ae2977b7373\" (UID: \"bc5a426c-df18-43db-b818-0ae2977b7373\") " Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.292529 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities" (OuterVolumeSpecName: "utilities") pod "bc5a426c-df18-43db-b818-0ae2977b7373" (UID: "bc5a426c-df18-43db-b818-0ae2977b7373"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.295272 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj" (OuterVolumeSpecName: "kube-api-access-hjkpj") pod "bc5a426c-df18-43db-b818-0ae2977b7373" (UID: "bc5a426c-df18-43db-b818-0ae2977b7373"). InnerVolumeSpecName "kube-api-access-hjkpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.392763 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.392806 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjkpj\" (UniqueName: \"kubernetes.io/projected/bc5a426c-df18-43db-b818-0ae2977b7373-kube-api-access-hjkpj\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.418041 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc5a426c-df18-43db-b818-0ae2977b7373" (UID: "bc5a426c-df18-43db-b818-0ae2977b7373"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.494057 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5a426c-df18-43db-b818-0ae2977b7373-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.669503 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxp7g" event={"ID":"bc5a426c-df18-43db-b818-0ae2977b7373","Type":"ContainerDied","Data":"29c8506ad8a99d2050c3295be0f8e93dcc55d3a726a0ada7bd62becade521185"} Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.669517 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxp7g" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.669818 4848 scope.go:117] "RemoveContainer" containerID="5d256014b2660f6f29495f68e098a9cdc3c684e72afbb5544b45ed2c18e3f02f" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.694757 4848 scope.go:117] "RemoveContainer" containerID="06bfd66e6fc31f8ae4bd4779b672b6ae1f38db168447b52487d8a965d2c43fd5" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.706727 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.709869 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxp7g"] Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.734678 4848 scope.go:117] "RemoveContainer" containerID="7a8b7c52023c21739ead434bb8b7bc43cda985188c0acc74c7ae53e005aebba4" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.974230 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" path="/var/lib/kubelet/pods/1f66c042-6bdd-4d2d-bd20-09521140274e/volumes" Dec 06 15:31:48 crc kubenswrapper[4848]: I1206 15:31:48.974886 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" path="/var/lib/kubelet/pods/bc5a426c-df18-43db-b818-0ae2977b7373/volumes" Dec 06 15:31:49 crc kubenswrapper[4848]: I1206 15:31:49.973024 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:49 crc kubenswrapper[4848]: I1206 15:31:49.973382 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:50 crc kubenswrapper[4848]: I1206 15:31:50.018013 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:50 crc kubenswrapper[4848]: I1206 15:31:50.417669 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 15:31:50 crc kubenswrapper[4848]: I1206 15:31:50.723232 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:52 crc kubenswrapper[4848]: I1206 15:31:52.571793 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:31:52 crc kubenswrapper[4848]: I1206 15:31:52.571851 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:31:52 crc kubenswrapper[4848]: I1206 15:31:52.624432 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:31:52 crc kubenswrapper[4848]: I1206 15:31:52.739491 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.237998 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4prm4"] Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.238678 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4prm4" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="registry-server" containerID="cri-o://547698fb856285f38fa2f13bab647a1a199611d0eec7e5e56ee21a5dbe468fe7" gracePeriod=2 Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.712185 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerID="547698fb856285f38fa2f13bab647a1a199611d0eec7e5e56ee21a5dbe468fe7" exitCode=0 Dec 
06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.712263 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerDied","Data":"547698fb856285f38fa2f13bab647a1a199611d0eec7e5e56ee21a5dbe468fe7"} Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.712525 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4prm4" event={"ID":"a3734c39-129f-4b25-be9d-e2ca36e98de3","Type":"ContainerDied","Data":"e8ae6e8e5b44b11ccea05d5416f0bc90bfe63111dc4ce30c6aa2134de618af0d"} Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.712544 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8ae6e8e5b44b11ccea05d5416f0bc90bfe63111dc4ce30c6aa2134de618af0d" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.739274 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805265 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805528 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805542 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805554 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805563 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" 
containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805572 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805579 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805588 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c531794-18cc-4c23-b373-d6628e2363ff" containerName="pruner" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805594 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c531794-18cc-4c23-b373-d6628e2363ff" containerName="pruner" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805603 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805609 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805618 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805623 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805633 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805640 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" 
containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805649 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805657 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805670 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805676 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805688 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805711 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="extract-utilities" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805723 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805730 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805739 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805747 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5528992-4741-41f7-bd58-d7614c936639" 
containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.805756 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805763 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="extract-content" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805868 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5a426c-df18-43db-b818-0ae2977b7373" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805887 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c531794-18cc-4c23-b373-d6628e2363ff" containerName="pruner" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805896 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5528992-4741-41f7-bd58-d7614c936639" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805907 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.805917 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f66c042-6bdd-4d2d-bd20-09521140274e" containerName="registry-server" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806277 4848 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806429 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806574 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d" gracePeriod=15 Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806618 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece" gracePeriod=15 Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806655 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c" gracePeriod=15 Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806686 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94" gracePeriod=15 Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.806723 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989" gracePeriod=15 Dec 06 15:31:54 crc 
kubenswrapper[4848]: I1206 15:31:54.807182 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807328 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807346 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807358 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807366 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807378 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807384 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807395 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807403 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807410 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807418 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807428 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807435 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 15:31:54 crc kubenswrapper[4848]: E1206 15:31:54.807444 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807454 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807559 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807573 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807584 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807594 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807604 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.807614 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.870581 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities" (OuterVolumeSpecName: "utilities") pod "a3734c39-129f-4b25-be9d-e2ca36e98de3" (UID: "a3734c39-129f-4b25-be9d-e2ca36e98de3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.869506 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities\") pod \"a3734c39-129f-4b25-be9d-e2ca36e98de3\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.872853 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content\") pod \"a3734c39-129f-4b25-be9d-e2ca36e98de3\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.872893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9597\" (UniqueName: \"kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597\") pod \"a3734c39-129f-4b25-be9d-e2ca36e98de3\" (UID: \"a3734c39-129f-4b25-be9d-e2ca36e98de3\") " Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873060 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873096 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873154 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873178 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873219 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873238 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873308 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873331 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.873364 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.878076 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597" (OuterVolumeSpecName: "kube-api-access-s9597") pod "a3734c39-129f-4b25-be9d-e2ca36e98de3" (UID: "a3734c39-129f-4b25-be9d-e2ca36e98de3"). InnerVolumeSpecName "kube-api-access-s9597". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.926377 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3734c39-129f-4b25-be9d-e2ca36e98de3" (UID: "a3734c39-129f-4b25-be9d-e2ca36e98de3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974309 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974381 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974444 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974465 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974480 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974521 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3734c39-129f-4b25-be9d-e2ca36e98de3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974531 4848 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9597\" (UniqueName: \"kubernetes.io/projected/a3734c39-129f-4b25-be9d-e2ca36e98de3-kube-api-access-s9597\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974566 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974595 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974613 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974632 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.974650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.975124 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.975311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:54 crc kubenswrapper[4848]: I1206 15:31:54.975408 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.722646 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.723910 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.724387 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c" exitCode=0 Dec 06 
15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.724421 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94" exitCode=0 Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.724429 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece" exitCode=0 Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.724438 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989" exitCode=2 Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.724469 4848 scope.go:117] "RemoveContainer" containerID="b40517cec4d0a40309fad0d84e95f166d7728ae7548c3620b05444c39b1744f9" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.727967 4848 generic.go:334] "Generic (PLEG): container finished" podID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" containerID="220a4573edc1d3b83b24665ea7ce557af87bc05db4106018e0c60024bb98dc48" exitCode=0 Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.728044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"88e145b9-64b0-44cc-8eb1-ce5d48791f20","Type":"ContainerDied","Data":"220a4573edc1d3b83b24665ea7ce557af87bc05db4106018e0c60024bb98dc48"} Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.728051 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4prm4" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.728978 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.729172 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.729379 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.762059 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:55 crc kubenswrapper[4848]: I1206 15:31:55.762252 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.64:6443: connect: connection refused" Dec 06 15:31:56 crc kubenswrapper[4848]: I1206 15:31:56.735776 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.049780 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.050890 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.051277 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.158797 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.159762 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.160315 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.160813 4848 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.161388 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.195587 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access\") pod \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.195649 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir\") pod \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 
15:31:57.195739 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock\") pod \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\" (UID: \"88e145b9-64b0-44cc-8eb1-ce5d48791f20\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.195802 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "88e145b9-64b0-44cc-8eb1-ce5d48791f20" (UID: "88e145b9-64b0-44cc-8eb1-ce5d48791f20"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.195897 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock" (OuterVolumeSpecName: "var-lock") pod "88e145b9-64b0-44cc-8eb1-ce5d48791f20" (UID: "88e145b9-64b0-44cc-8eb1-ce5d48791f20"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.195978 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.201241 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "88e145b9-64b0-44cc-8eb1-ce5d48791f20" (UID: "88e145b9-64b0-44cc-8eb1-ce5d48791f20"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299048 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299148 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299183 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299296 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299302 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299350 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299521 4848 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/88e145b9-64b0-44cc-8eb1-ce5d48791f20-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299535 4848 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299543 4848 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299552 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88e145b9-64b0-44cc-8eb1-ce5d48791f20-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.299562 4848 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.519292 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.520261 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.520724 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.521062 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.521459 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.521489 4848 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.521823 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="200ms" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.723422 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="400ms" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.747663 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.748780 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d" exitCode=0 Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.748867 4848 scope.go:117] "RemoveContainer" containerID="548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.748939 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.750764 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"88e145b9-64b0-44cc-8eb1-ce5d48791f20","Type":"ContainerDied","Data":"c3289b7b8b9bac2b471a06d7137da44a0fed6216740b29be72fec57a12b99182"} Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.750802 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3289b7b8b9bac2b471a06d7137da44a0fed6216740b29be72fec57a12b99182" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.750870 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.767332 4848 scope.go:117] "RemoveContainer" containerID="648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.776234 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.777211 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.777637 4848 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.783604 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.783958 4848 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.784427 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.785726 4848 scope.go:117] "RemoveContainer" containerID="5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.804891 4848 scope.go:117] "RemoveContainer" containerID="00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.825322 4848 scope.go:117] "RemoveContainer" containerID="ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.842610 4848 scope.go:117] "RemoveContainer" 
containerID="ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.864346 4848 scope.go:117] "RemoveContainer" containerID="548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.864834 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\": container with ID starting with 548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c not found: ID does not exist" containerID="548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.864874 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c"} err="failed to get container status \"548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\": rpc error: code = NotFound desc = could not find container \"548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c\": container with ID starting with 548db1462b97aec8f558b050bf69112fadf7a77ec3d30c8f7851e096d231521c not found: ID does not exist" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.864937 4848 scope.go:117] "RemoveContainer" containerID="648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.865917 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\": container with ID starting with 648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94 not found: ID does not exist" containerID="648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94" Dec 06 15:31:57 crc 
kubenswrapper[4848]: I1206 15:31:57.865956 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94"} err="failed to get container status \"648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\": rpc error: code = NotFound desc = could not find container \"648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94\": container with ID starting with 648af40ad827dbf25488475eeeb23d04911f786a32f45454318c2c04c3d68f94 not found: ID does not exist" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.865988 4848 scope.go:117] "RemoveContainer" containerID="5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.866519 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\": container with ID starting with 5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece not found: ID does not exist" containerID="5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.866546 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece"} err="failed to get container status \"5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\": rpc error: code = NotFound desc = could not find container \"5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece\": container with ID starting with 5c2d0ce318228707465844d91401ef5e957f2ad6ba8905bb633b7e092440bece not found: ID does not exist" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.866566 4848 scope.go:117] "RemoveContainer" containerID="00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989" Dec 06 
15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.866821 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\": container with ID starting with 00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989 not found: ID does not exist" containerID="00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.866843 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989"} err="failed to get container status \"00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\": rpc error: code = NotFound desc = could not find container \"00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989\": container with ID starting with 00b480fd02aef8c4e3cf8b1a8525da122848dd225e01786930b9bed64219d989 not found: ID does not exist" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.866858 4848 scope.go:117] "RemoveContainer" containerID="ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.867096 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\": container with ID starting with ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d not found: ID does not exist" containerID="ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.867121 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d"} err="failed to get container status 
\"ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\": rpc error: code = NotFound desc = could not find container \"ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d\": container with ID starting with ef616d87d82070f092148f5a58627649d6d8b04c59d4a6a9ad9dedf2ccb30a0d not found: ID does not exist" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.867141 4848 scope.go:117] "RemoveContainer" containerID="ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748" Dec 06 15:31:57 crc kubenswrapper[4848]: E1206 15:31:57.867659 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\": container with ID starting with ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748 not found: ID does not exist" containerID="ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748" Dec 06 15:31:57 crc kubenswrapper[4848]: I1206 15:31:57.867712 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748"} err="failed to get container status \"ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\": rpc error: code = NotFound desc = could not find container \"ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748\": container with ID starting with ea27abf37a9246d7777e5ed849a8ac4db15669e0a3a74db36c63ad366197e748 not found: ID does not exist" Dec 06 15:31:58 crc kubenswrapper[4848]: E1206 15:31:58.124118 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="800ms" Dec 06 15:31:58 crc kubenswrapper[4848]: E1206 15:31:58.925159 4848 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="1.6s" Dec 06 15:31:58 crc kubenswrapper[4848]: I1206 15:31:58.972912 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 06 15:31:59 crc kubenswrapper[4848]: E1206 15:31:59.855121 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:59 crc kubenswrapper[4848]: I1206 15:31:59.855745 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:31:59 crc kubenswrapper[4848]: E1206 15:31:59.882409 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187eaa1708a61734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 15:31:59.881885492 +0000 UTC 
m=+187.179896415,LastTimestamp:2025-12-06 15:31:59.881885492 +0000 UTC m=+187.179896415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 15:32:00 crc kubenswrapper[4848]: E1206 15:32:00.526413 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="3.2s" Dec 06 15:32:00 crc kubenswrapper[4848]: I1206 15:32:00.768481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f"} Dec 06 15:32:00 crc kubenswrapper[4848]: I1206 15:32:00.768531 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"784f83db9f5a2aa30b4dc8d2135e932bbba1ed2f6b5119509e94bd0ea124282b"} Dec 06 15:32:00 crc kubenswrapper[4848]: E1206 15:32:00.769152 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:32:00 crc kubenswrapper[4848]: I1206 15:32:00.769331 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:00 crc kubenswrapper[4848]: I1206 
15:32:00.770053 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:02 crc kubenswrapper[4848]: I1206 15:32:02.970818 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:02 crc kubenswrapper[4848]: I1206 15:32:02.971781 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:03 crc kubenswrapper[4848]: E1206 15:32:03.060135 4848 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" volumeName="registry-storage" Dec 06 15:32:03 crc kubenswrapper[4848]: E1206 15:32:03.727081 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="6.4s" Dec 
06 15:32:04 crc kubenswrapper[4848]: E1206 15:32:04.379774 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187eaa1708a61734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 15:31:59.881885492 +0000 UTC m=+187.179896415,LastTimestamp:2025-12-06 15:31:59.881885492 +0000 UTC m=+187.179896415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.966943 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.968487 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.969448 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.993412 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.993464 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21" Dec 06 15:32:05 crc kubenswrapper[4848]: E1206 15:32:05.993887 4848 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:32:05 crc kubenswrapper[4848]: I1206 15:32:05.994641 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.809147 4848 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9fb89951ef7ad6e5e2025e6265ee79b14134e95b652053b2cda27613df4393cb" exitCode=0
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.809284 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9fb89951ef7ad6e5e2025e6265ee79b14134e95b652053b2cda27613df4393cb"}
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.809482 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cfbc6c34e513a90e11580d0bcc144f09f3ce2e8263769af239bf8862a5ddae46"}
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.809822 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.809838 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.810227 4848 status_manager.go:851] "Failed to get status for pod" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused"
Dec 06 15:32:06 crc kubenswrapper[4848]: E1206 15:32:06.810302 4848 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:06 crc kubenswrapper[4848]: I1206 15:32:06.810464 4848 status_manager.go:851] "Failed to get status for pod" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" pod="openshift-marketplace/certified-operators-4prm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4prm4\": dial tcp 38.102.83.64:6443: connect: connection refused"
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.819435 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.819756 4848 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0" exitCode=1
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.819850 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0"}
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.820387 4848 scope.go:117] "RemoveContainer" containerID="410573f33c267f6cbd5c2799a41e66690797fb272ffc70c31d90e4e9d77eaec0"
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.831083 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"980537c112ec258774583423fae40d5c9efc496211d28201a72fd8d2ad8ad452"}
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.831129 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7758b87de4dd32ab2cd80b3e42a6c5c4636702d1ea29e725ae85bc385c767875"}
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.831149 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c5e459f46d47b0e2b8a0c3630ac33157ca6e5ea6e964ca8923b1f8be571bc8f"}
Dec 06 15:32:07 crc kubenswrapper[4848]: I1206 15:32:07.831162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0110bc4d4ce67ba260dc229a0dd17e47b84891afa14958f92b61fcf341ffb2ef"}
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.839913 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"420c8ab635ede4cb121073c31e64ab06343086dc5cf127e512c65cfffc3267a6"}
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.840041 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.840117 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.840132 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.843905 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 06 15:32:08 crc kubenswrapper[4848]: I1206 15:32:08.843954 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd7c73cb2904e1952826b991b3e5efd47a328897aa23a8bac9df643eb0086d39"}
Dec 06 15:32:10 crc kubenswrapper[4848]: I1206 15:32:10.995565 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:10 crc kubenswrapper[4848]: I1206 15:32:10.997463 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:11 crc kubenswrapper[4848]: I1206 15:32:11.003741 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:12 crc kubenswrapper[4848]: I1206 15:32:12.683869 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:32:12 crc kubenswrapper[4848]: I1206 15:32:12.893220 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" podUID="e5fee749-2cc4-41ea-9b22-7499624ae892" containerName="oauth-openshift" containerID="cri-o://2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840" gracePeriod=15
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.242270 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353624 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353725 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353782 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353826 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwg85\" (UniqueName: \"kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353872 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.353960 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354000 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354032 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354072 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354111 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354158 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354190 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354253 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir\") pod \"e5fee749-2cc4-41ea-9b22-7499624ae892\" (UID: \"e5fee749-2cc4-41ea-9b22-7499624ae892\") "
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.354607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.355181 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.356644 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.356719 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.356890 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.359941 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.365547 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85" (OuterVolumeSpecName: "kube-api-access-xwg85") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "kube-api-access-xwg85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.367218 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.367378 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.369846 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.371625 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.372416 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.372741 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.383011 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e5fee749-2cc4-41ea-9b22-7499624ae892" (UID: "e5fee749-2cc4-41ea-9b22-7499624ae892"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455236 4848 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455270 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455284 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455294 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455305 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwg85\" (UniqueName: \"kubernetes.io/projected/e5fee749-2cc4-41ea-9b22-7499624ae892-kube-api-access-xwg85\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455315 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455323 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455332 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455341 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455349 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455358 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455367 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455375 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.455384 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5fee749-2cc4-41ea-9b22-7499624ae892-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.848494 4848 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879247 4848 generic.go:334] "Generic (PLEG): container finished" podID="e5fee749-2cc4-41ea-9b22-7499624ae892" containerID="2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840" exitCode=0
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879570 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" event={"ID":"e5fee749-2cc4-41ea-9b22-7499624ae892","Type":"ContainerDied","Data":"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"}
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879602 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw" event={"ID":"e5fee749-2cc4-41ea-9b22-7499624ae892","Type":"ContainerDied","Data":"9ece09f732494eeedac8a44356bbfc0fe897e07ca6dc5c1ea560a60ce3c9a57f"}
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879619 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879634 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879663 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4nmrw"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.879622 4848 scope.go:117] "RemoveContainer" containerID="2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.887188 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.889607 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8983dbcd-fdcc-47ad-ad94-d99957ddca4c"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.897607 4848 scope.go:117] "RemoveContainer" containerID="2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"
Dec 06 15:32:13 crc kubenswrapper[4848]: E1206 15:32:13.898159 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840\": container with ID starting with 2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840 not found: ID does not exist" containerID="2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"
Dec 06 15:32:13 crc kubenswrapper[4848]: I1206 15:32:13.898296 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840"} err="failed to get container status \"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840\": rpc error: code = NotFound desc = could not find container \"2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840\": container with ID starting with 2e588e7b6787e2bdc1825a4614c25c8d55247de739da0ed65f0f7930a4954840 not found: ID does not exist"
Dec 06 15:32:14 crc kubenswrapper[4848]: I1206 15:32:14.885400 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:14 crc kubenswrapper[4848]: I1206 15:32:14.886058 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21"
Dec 06 15:32:15 crc kubenswrapper[4848]: I1206 15:32:15.960056 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:32:15 crc kubenswrapper[4848]: I1206 15:32:15.964172 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.150292 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.150648 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.150775 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5"
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.151740 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.151858 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff" gracePeriod=600
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.912094 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff" exitCode=0
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.912164 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff"}
Dec 06 15:32:17 crc kubenswrapper[4848]: I1206 15:32:17.912582 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72"}
Dec 06 15:32:20 crc kubenswrapper[4848]: I1206 15:32:20.249375 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 06 15:32:21 crc kubenswrapper[4848]: I1206 15:32:21.200507 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 06 15:32:22 crc kubenswrapper[4848]: I1206 15:32:22.690325 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 15:32:22 crc kubenswrapper[4848]: I1206 15:32:22.717027 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 06 15:32:22 crc kubenswrapper[4848]: I1206 15:32:22.976299 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8983dbcd-fdcc-47ad-ad94-d99957ddca4c"
Dec 06 15:32:23 crc kubenswrapper[4848]: I1206 15:32:23.005214 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 06 15:32:23 crc kubenswrapper[4848]: I1206 15:32:23.206463 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 06 15:32:23 crc kubenswrapper[4848]: I1206 15:32:23.577524 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 06 15:32:23 crc kubenswrapper[4848]: I1206 15:32:23.592628 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 06 15:32:24 crc kubenswrapper[4848]: I1206 15:32:24.009333 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 06 15:32:24 crc kubenswrapper[4848]: I1206 15:32:24.391211 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 06 15:32:24 crc kubenswrapper[4848]: I1206 15:32:24.611869 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 06 15:32:24 crc kubenswrapper[4848]: I1206 15:32:24.922389 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 06 15:32:24 crc kubenswrapper[4848]: I1206 15:32:24.990689 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 06 15:32:25 crc kubenswrapper[4848]: I1206 15:32:25.485090 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 06 15:32:25 crc kubenswrapper[4848]: I1206 15:32:25.934952 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 06 15:32:25 crc kubenswrapper[4848]: I1206 15:32:25.959917 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 06 15:32:26 crc kubenswrapper[4848]: I1206 15:32:26.257507 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 06 15:32:26 crc kubenswrapper[4848]: I1206 15:32:26.322915 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 06 15:32:26 crc kubenswrapper[4848]: I1206 15:32:26.389410 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 06 15:32:26 crc kubenswrapper[4848]: I1206 15:32:26.731078 4848 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 06 15:32:26 crc kubenswrapper[4848]: I1206 15:32:26.742523 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.010103 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.169903 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.187423 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.273513 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.340979 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.443502 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.481652 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.616229 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 06 15:32:27 crc kubenswrapper[4848]: I1206 15:32:27.834288 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.021008 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.092212 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.092782 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.100483 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.163540 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.363942 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.391933 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 06 15:32:28 crc kubenswrapper[4848]: I1206 15:32:28.640337 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.068493 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.093060 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.120563 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.175103 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.186208 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.233757 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.261141 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.348498 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.363667 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.514114 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.773775 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.782177 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 15:32:29 crc kubenswrapper[4848]: I1206 15:32:29.932151 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.037765 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 15:32:30 crc kubenswrapper[4848]: 
I1206 15:32:30.116458 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.132624 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.244526 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.272283 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.339379 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.360955 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.515885 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.544626 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.555351 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.605490 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.676057 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 
06 15:32:30 crc kubenswrapper[4848]: I1206 15:32:30.739362 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.126212 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.159528 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.189205 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.284189 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.363496 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.374184 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.385937 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.387491 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.399872 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.479023 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.511453 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.693522 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.695110 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.738317 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.754565 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.773288 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.946571 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.957855 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 15:32:31 crc kubenswrapper[4848]: I1206 15:32:31.974725 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.021670 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 15:32:32 crc 
kubenswrapper[4848]: I1206 15:32:32.053020 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.060298 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.280515 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.506780 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.637506 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.638212 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.693324 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.774566 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.904670 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.915801 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 15:32:32 crc kubenswrapper[4848]: I1206 15:32:32.931192 4848 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.002451 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.014415 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.088110 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.160545 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.245561 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.368127 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.434210 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.475964 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.495605 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.503613 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.550989 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.598908 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.604622 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.745152 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.799984 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.901255 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.922354 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 15:32:33 crc kubenswrapper[4848]: I1206 15:32:33.960338 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.133137 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.172839 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.255760 4848 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.283178 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.323956 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.360075 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.391324 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.417340 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.421909 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.429326 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.506204 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.516089 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.519887 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.521462 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.661109 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.671197 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.725005 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.765317 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.769433 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.781930 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.802471 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.813131 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 15:32:34.901568 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 15:32:34 crc kubenswrapper[4848]: I1206 
15:32:34.919489 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.018229 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.023426 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.048661 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.109185 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.129360 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.133775 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.140896 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.289860 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.297641 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.544449 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.604144 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.608066 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.699311 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.709808 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.739439 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.745123 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.805043 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.860524 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.877402 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 15:32:35 crc kubenswrapper[4848]: I1206 15:32:35.918775 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.129553 4848 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.239786 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.255974 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.300691 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.305919 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.323015 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.345578 4848 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.362270 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.371073 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.547589 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.560077 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.609026 4848 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.863238 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.893439 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.923134 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 15:32:36 crc kubenswrapper[4848]: I1206 15:32:36.962367 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.102169 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.155479 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.160286 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.208476 4848 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.257061 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.318792 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 
15:32:37.346197 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.406396 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.429743 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.462899 4848 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.567932 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.604600 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.659461 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.665617 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.689135 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.775568 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.797232 4848 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.845682 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.918328 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.922181 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 15:32:37 crc kubenswrapper[4848]: I1206 15:32:37.944100 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.018179 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.188806 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.188947 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.284026 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.284057 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.287871 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 15:32:38 crc 
kubenswrapper[4848]: I1206 15:32:38.300266 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.300505 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.317033 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.505095 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.733262 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.764212 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.868246 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.925180 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.942575 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.958219 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 15:32:38 crc kubenswrapper[4848]: I1206 15:32:38.965446 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.177498 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.212503 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.254767 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.308078 4848 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316215 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-4nmrw","openshift-marketplace/certified-operators-4prm4"] Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316303 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6785554d74-g9vps","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 15:32:39 crc kubenswrapper[4848]: E1206 15:32:39.316547 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fee749-2cc4-41ea-9b22-7499624ae892" containerName="oauth-openshift" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316566 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fee749-2cc4-41ea-9b22-7499624ae892" containerName="oauth-openshift" Dec 06 15:32:39 crc kubenswrapper[4848]: E1206 15:32:39.316590 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" containerName="installer" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316604 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" containerName="installer" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316829 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e145b9-64b0-44cc-8eb1-ce5d48791f20" containerName="installer" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316846 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316868 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fee749-2cc4-41ea-9b22-7499624ae892" containerName="oauth-openshift" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.316879 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4088dff3-91f6-41f3-afad-2b6bc1cefe21" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.317998 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.319859 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.320528 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.320787 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.321290 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.321489 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.321580 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.321682 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.321891 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.323018 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.323098 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.323760 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.324480 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.328349 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.336049 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.337065 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 
15:32:39.348682 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.398103 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.398086504 podStartE2EDuration="26.398086504s" podCreationTimestamp="2025-12-06 15:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:32:39.386923937 +0000 UTC m=+226.684934870" watchObservedRunningTime="2025-12-06 15:32:39.398086504 +0000 UTC m=+226.696097417" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.473625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.492915 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.515141 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.515811 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-audit-policies\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " 
pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.515982 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-session\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516158 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516302 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516465 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkck\" (UniqueName: \"kubernetes.io/projected/dbbca775-e816-4826-b09c-cb749013832d-kube-api-access-nqkck\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516622 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516817 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbbca775-e816-4826-b09c-cb749013832d-audit-dir\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.516995 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.517178 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-error\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.517328 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.517491 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-login\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.517635 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.517790 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.618027 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.619512 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-login\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.619940 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.621014 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.621373 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.621741 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-audit-policies\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: 
\"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.622061 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-session\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.622270 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.622495 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.622873 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkck\" (UniqueName: \"kubernetes.io/projected/dbbca775-e816-4826-b09c-cb749013832d-kube-api-access-nqkck\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623119 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.622964 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623153 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-audit-policies\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623444 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbbca775-e816-4826-b09c-cb749013832d-audit-dir\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623611 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc 
kubenswrapper[4848]: I1206 15:32:39.623642 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbbca775-e816-4826-b09c-cb749013832d-audit-dir\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623880 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.623966 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-error\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.624046 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.628193 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-session\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.628458 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.629292 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-error\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.629457 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.629776 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-login\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" 
Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.629922 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.631931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.634272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.637240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dbbca775-e816-4826-b09c-cb749013832d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.644866 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkck\" (UniqueName: 
\"kubernetes.io/projected/dbbca775-e816-4826-b09c-cb749013832d-kube-api-access-nqkck\") pod \"oauth-openshift-6785554d74-g9vps\" (UID: \"dbbca775-e816-4826-b09c-cb749013832d\") " pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.658832 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.731036 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.741494 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 15:32:39 crc kubenswrapper[4848]: I1206 15:32:39.883325 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.065021 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6785554d74-g9vps"] Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.160250 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.245092 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.394437 4848 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.428156 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.484831 4848 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.505362 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.509235 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.706006 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.973388 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3734c39-129f-4b25-be9d-e2ca36e98de3" path="/var/lib/kubelet/pods/a3734c39-129f-4b25-be9d-e2ca36e98de3/volumes" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.974517 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fee749-2cc4-41ea-9b22-7499624ae892" path="/var/lib/kubelet/pods/e5fee749-2cc4-41ea-9b22-7499624ae892/volumes" Dec 06 15:32:40 crc kubenswrapper[4848]: I1206 15:32:40.994124 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.050564 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6785554d74-g9vps_dbbca775-e816-4826-b09c-cb749013832d/oauth-openshift/0.log" Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.050622 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbbca775-e816-4826-b09c-cb749013832d" containerID="48996d59fc9efdd17b1cca74e307b89d185cae23db4576ad922c59cce7f1e10e" exitCode=255 Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.050751 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" event={"ID":"dbbca775-e816-4826-b09c-cb749013832d","Type":"ContainerDied","Data":"48996d59fc9efdd17b1cca74e307b89d185cae23db4576ad922c59cce7f1e10e"} Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.050812 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" event={"ID":"dbbca775-e816-4826-b09c-cb749013832d","Type":"ContainerStarted","Data":"2733cf19a2cb425354d97758d5cfe5ef5e1c2b9c6bd671b3c7f616b97a264186"} Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.051121 4848 scope.go:117] "RemoveContainer" containerID="48996d59fc9efdd17b1cca74e307b89d185cae23db4576ad922c59cce7f1e10e" Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.216317 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 15:32:41 crc kubenswrapper[4848]: I1206 15:32:41.803245 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.046529 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.060431 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6785554d74-g9vps_dbbca775-e816-4826-b09c-cb749013832d/oauth-openshift/1.log" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.061056 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6785554d74-g9vps_dbbca775-e816-4826-b09c-cb749013832d/oauth-openshift/0.log" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.061124 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbbca775-e816-4826-b09c-cb749013832d" 
containerID="fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c" exitCode=255 Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.061160 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" event={"ID":"dbbca775-e816-4826-b09c-cb749013832d","Type":"ContainerDied","Data":"fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c"} Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.061211 4848 scope.go:117] "RemoveContainer" containerID="48996d59fc9efdd17b1cca74e307b89d185cae23db4576ad922c59cce7f1e10e" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.061885 4848 scope.go:117] "RemoveContainer" containerID="fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c" Dec 06 15:32:42 crc kubenswrapper[4848]: E1206 15:32:42.062210 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6785554d74-g9vps_openshift-authentication(dbbca775-e816-4826-b09c-cb749013832d)\"" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" podUID="dbbca775-e816-4826-b09c-cb749013832d" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.553634 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.679833 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 15:32:42 crc kubenswrapper[4848]: I1206 15:32:42.693332 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 15:32:43 crc kubenswrapper[4848]: I1206 15:32:43.069169 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-6785554d74-g9vps_dbbca775-e816-4826-b09c-cb749013832d/oauth-openshift/1.log" Dec 06 15:32:43 crc kubenswrapper[4848]: I1206 15:32:43.071688 4848 scope.go:117] "RemoveContainer" containerID="fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c" Dec 06 15:32:43 crc kubenswrapper[4848]: E1206 15:32:43.072016 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6785554d74-g9vps_openshift-authentication(dbbca775-e816-4826-b09c-cb749013832d)\"" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" podUID="dbbca775-e816-4826-b09c-cb749013832d" Dec 06 15:32:43 crc kubenswrapper[4848]: I1206 15:32:43.188593 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 15:32:44 crc kubenswrapper[4848]: I1206 15:32:44.111347 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 15:32:47 crc kubenswrapper[4848]: I1206 15:32:47.394530 4848 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 15:32:47 crc kubenswrapper[4848]: I1206 15:32:47.395151 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f" gracePeriod=5 Dec 06 15:32:49 crc kubenswrapper[4848]: I1206 15:32:49.659153 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:49 crc kubenswrapper[4848]: 
I1206 15:32:49.659241 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:32:49 crc kubenswrapper[4848]: I1206 15:32:49.660052 4848 scope.go:117] "RemoveContainer" containerID="fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c" Dec 06 15:32:49 crc kubenswrapper[4848]: E1206 15:32:49.660459 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6785554d74-g9vps_openshift-authentication(dbbca775-e816-4826-b09c-cb749013832d)\"" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" podUID="dbbca775-e816-4826-b09c-cb749013832d" Dec 06 15:32:52 crc kubenswrapper[4848]: I1206 15:32:52.969953 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 15:32:52 crc kubenswrapper[4848]: I1206 15:32:52.970254 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000552 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000597 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000673 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000780 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000809 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000821 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000875 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000962 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.001641 4848 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.001658 4848 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.001667 4848 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.000668 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" 
(OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.007425 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.102622 4848 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.102665 4848 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.135435 4848 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f" exitCode=137 Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.135491 4848 scope.go:117] "RemoveContainer" containerID="649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.135576 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.153052 4848 scope.go:117] "RemoveContainer" containerID="649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f" Dec 06 15:32:53 crc kubenswrapper[4848]: E1206 15:32:53.153475 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f\": container with ID starting with 649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f not found: ID does not exist" containerID="649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f" Dec 06 15:32:53 crc kubenswrapper[4848]: I1206 15:32:53.153506 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f"} err="failed to get container status \"649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f\": rpc error: code = NotFound desc = could not find container \"649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f\": container with ID starting with 649c5fe94ebf602f17284889a1f043ee018b74b20ceeb77eec47453062c4bf0f not found: ID does not exist" Dec 06 15:32:54 crc kubenswrapper[4848]: I1206 15:32:54.978516 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 06 15:32:59 crc kubenswrapper[4848]: I1206 15:32:59.176163 4848 generic.go:334] "Generic (PLEG): container finished" podID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerID="4076a09d59ac0d5d7055a84db2c7974c7145f95532fbe10b74659660bf761617" exitCode=0 Dec 06 15:32:59 crc kubenswrapper[4848]: I1206 15:32:59.176311 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerDied","Data":"4076a09d59ac0d5d7055a84db2c7974c7145f95532fbe10b74659660bf761617"} Dec 06 15:32:59 crc kubenswrapper[4848]: I1206 15:32:59.176748 4848 scope.go:117] "RemoveContainer" containerID="4076a09d59ac0d5d7055a84db2c7974c7145f95532fbe10b74659660bf761617" Dec 06 15:33:00 crc kubenswrapper[4848]: I1206 15:33:00.184030 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerStarted","Data":"e307124f47fd7190b563ddefe4afea84fe09bd8ae53bc2e0e4e7095c3b82acab"} Dec 06 15:33:00 crc kubenswrapper[4848]: I1206 15:33:00.184666 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:33:00 crc kubenswrapper[4848]: I1206 15:33:00.191129 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:33:01 crc kubenswrapper[4848]: I1206 15:33:01.966742 4848 scope.go:117] "RemoveContainer" containerID="fbd0d32f4a26ceb931f0073398b84f16964eb4d942b1b89c20f238e7b608502c" Dec 06 15:33:03 crc kubenswrapper[4848]: I1206 15:33:03.201341 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6785554d74-g9vps_dbbca775-e816-4826-b09c-cb749013832d/oauth-openshift/1.log" Dec 06 15:33:03 crc kubenswrapper[4848]: I1206 15:33:03.202028 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" event={"ID":"dbbca775-e816-4826-b09c-cb749013832d","Type":"ContainerStarted","Data":"ab2074fffd61fd0934b971d5b7d69298149bd94e009ed8424aaf6736921d9ca8"} Dec 06 15:33:03 crc kubenswrapper[4848]: I1206 15:33:03.202489 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:33:03 crc kubenswrapper[4848]: I1206 15:33:03.208688 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" Dec 06 15:33:03 crc kubenswrapper[4848]: I1206 15:33:03.234921 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6785554d74-g9vps" podStartSLOduration=76.234900892 podStartE2EDuration="1m16.234900892s" podCreationTimestamp="2025-12-06 15:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:03.232023024 +0000 UTC m=+250.530033947" watchObservedRunningTime="2025-12-06 15:33:03.234900892 +0000 UTC m=+250.532911805" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.140129 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.140890 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" podUID="b5de67fc-ba65-4752-b10a-86149771384a" containerName="controller-manager" containerID="cri-o://9ca702a505344b994f1d4cf8b806dcf50e7333e1796c4fce986c80b65af57e18" gracePeriod=30 Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.239847 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.240086 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" podUID="b1d6d035-9129-459e-b913-95a87f868196" containerName="route-controller-manager" 
containerID="cri-o://140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f" gracePeriod=30 Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.313045 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5de67fc-ba65-4752-b10a-86149771384a" containerID="9ca702a505344b994f1d4cf8b806dcf50e7333e1796c4fce986c80b65af57e18" exitCode=0 Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.313127 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" event={"ID":"b5de67fc-ba65-4752-b10a-86149771384a","Type":"ContainerDied","Data":"9ca702a505344b994f1d4cf8b806dcf50e7333e1796c4fce986c80b65af57e18"} Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.467087 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.504937 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca\") pod \"b5de67fc-ba65-4752-b10a-86149771384a\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.504988 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles\") pod \"b5de67fc-ba65-4752-b10a-86149771384a\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.505018 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config\") pod \"b5de67fc-ba65-4752-b10a-86149771384a\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 
15:33:24.505049 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7g7\" (UniqueName: \"kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7\") pod \"b5de67fc-ba65-4752-b10a-86149771384a\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.505100 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert\") pod \"b5de67fc-ba65-4752-b10a-86149771384a\" (UID: \"b5de67fc-ba65-4752-b10a-86149771384a\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.505604 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5de67fc-ba65-4752-b10a-86149771384a" (UID: "b5de67fc-ba65-4752-b10a-86149771384a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.506124 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config" (OuterVolumeSpecName: "config") pod "b5de67fc-ba65-4752-b10a-86149771384a" (UID: "b5de67fc-ba65-4752-b10a-86149771384a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.506359 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b5de67fc-ba65-4752-b10a-86149771384a" (UID: "b5de67fc-ba65-4752-b10a-86149771384a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.525920 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5de67fc-ba65-4752-b10a-86149771384a" (UID: "b5de67fc-ba65-4752-b10a-86149771384a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.527007 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7" (OuterVolumeSpecName: "kube-api-access-hg7g7") pod "b5de67fc-ba65-4752-b10a-86149771384a" (UID: "b5de67fc-ba65-4752-b10a-86149771384a"). InnerVolumeSpecName "kube-api-access-hg7g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.556083 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.606647 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca\") pod \"b1d6d035-9129-459e-b913-95a87f868196\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.606692 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config\") pod \"b1d6d035-9129-459e-b913-95a87f868196\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.606787 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert\") pod \"b1d6d035-9129-459e-b913-95a87f868196\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.606848 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhh7n\" (UniqueName: \"kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n\") pod \"b1d6d035-9129-459e-b913-95a87f868196\" (UID: \"b1d6d035-9129-459e-b913-95a87f868196\") " Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607095 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607113 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607127 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5de67fc-ba65-4752-b10a-86149771384a-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607138 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7g7\" (UniqueName: \"kubernetes.io/projected/b5de67fc-ba65-4752-b10a-86149771384a-kube-api-access-hg7g7\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607149 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5de67fc-ba65-4752-b10a-86149771384a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607387 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca" (OuterVolumeSpecName: "client-ca") pod "b1d6d035-9129-459e-b913-95a87f868196" (UID: "b1d6d035-9129-459e-b913-95a87f868196"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.607408 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config" (OuterVolumeSpecName: "config") pod "b1d6d035-9129-459e-b913-95a87f868196" (UID: "b1d6d035-9129-459e-b913-95a87f868196"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.609961 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n" (OuterVolumeSpecName: "kube-api-access-vhh7n") pod "b1d6d035-9129-459e-b913-95a87f868196" (UID: "b1d6d035-9129-459e-b913-95a87f868196"). InnerVolumeSpecName "kube-api-access-vhh7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.609987 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b1d6d035-9129-459e-b913-95a87f868196" (UID: "b1d6d035-9129-459e-b913-95a87f868196"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.707965 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhh7n\" (UniqueName: \"kubernetes.io/projected/b1d6d035-9129-459e-b913-95a87f868196-kube-api-access-vhh7n\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.708032 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.708044 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6d035-9129-459e-b913-95a87f868196-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:24 crc kubenswrapper[4848]: I1206 15:33:24.708053 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1d6d035-9129-459e-b913-95a87f868196-serving-cert\") on node \"crc\" DevicePath 
\"\"" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177256 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:25 crc kubenswrapper[4848]: E1206 15:33:25.177439 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5de67fc-ba65-4752-b10a-86149771384a" containerName="controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177450 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5de67fc-ba65-4752-b10a-86149771384a" containerName="controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: E1206 15:33:25.177466 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177472 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 15:33:25 crc kubenswrapper[4848]: E1206 15:33:25.177481 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6d035-9129-459e-b913-95a87f868196" containerName="route-controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177488 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6d035-9129-459e-b913-95a87f868196" containerName="route-controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177572 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d6d035-9129-459e-b913-95a87f868196" containerName="route-controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177583 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5de67fc-ba65-4752-b10a-86149771384a" containerName="controller-manager" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177594 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.177941 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.191722 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.215093 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.215152 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.215276 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4lq\" (UniqueName: \"kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.215310 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.215409 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.316396 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.316458 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.316516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4lq\" (UniqueName: \"kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.316550 
4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.316604 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.317918 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.318361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.318503 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc 
kubenswrapper[4848]: I1206 15:33:25.323291 4848 generic.go:334] "Generic (PLEG): container finished" podID="b1d6d035-9129-459e-b913-95a87f868196" containerID="140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f" exitCode=0 Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.323406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" event={"ID":"b1d6d035-9129-459e-b913-95a87f868196","Type":"ContainerDied","Data":"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f"} Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.323463 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.323497 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s" event={"ID":"b1d6d035-9129-459e-b913-95a87f868196","Type":"ContainerDied","Data":"2dbc467cbdeae5ab30075f1090dd7e42ca731b8ef9d84600197bc0b5c7c70e8b"} Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.323531 4848 scope.go:117] "RemoveContainer" containerID="140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.324659 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.325477 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" 
event={"ID":"b5de67fc-ba65-4752-b10a-86149771384a","Type":"ContainerDied","Data":"65d808b5ec4545cd4e97f4f65b3adb6470c773403a02a6424001c53b61d3ded0"} Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.325487 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8jq72" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.335810 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4lq\" (UniqueName: \"kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq\") pod \"controller-manager-895789bdc-vlc5p\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.390335 4848 scope.go:117] "RemoveContainer" containerID="140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f" Dec 06 15:33:25 crc kubenswrapper[4848]: E1206 15:33:25.391188 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f\": container with ID starting with 140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f not found: ID does not exist" containerID="140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.391226 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f"} err="failed to get container status \"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f\": rpc error: code = NotFound desc = could not find container \"140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f\": container with ID starting with 140f48fdb16da751bc1f93f383505eed9482b81bbafd44e519069f4a484c568f 
not found: ID does not exist" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.391252 4848 scope.go:117] "RemoveContainer" containerID="9ca702a505344b994f1d4cf8b806dcf50e7333e1796c4fce986c80b65af57e18" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.392835 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.396435 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8jq72"] Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.402512 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.405356 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-48n2s"] Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.494008 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:25 crc kubenswrapper[4848]: I1206 15:33:25.897279 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:25 crc kubenswrapper[4848]: W1206 15:33:25.904870 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bc8da59_541d_42b3_8fa2_7f0394eec028.slice/crio-ed874e7cb3879d40746cd3c460b3a23473ddc89ac4c4df405b6a9c1bc0fd4cdd WatchSource:0}: Error finding container ed874e7cb3879d40746cd3c460b3a23473ddc89ac4c4df405b6a9c1bc0fd4cdd: Status 404 returned error can't find the container with id ed874e7cb3879d40746cd3c460b3a23473ddc89ac4c4df405b6a9c1bc0fd4cdd Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.180700 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.181823 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.185048 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.185478 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.186454 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.186636 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.187650 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.189424 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.196144 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.226370 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.226865 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.227071 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4st4\" (UniqueName: \"kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.227313 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.328293 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.328605 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4st4\" (UniqueName: \"kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4\") pod 
\"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.328788 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.328935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.329659 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.329987 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.333976 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" event={"ID":"1bc8da59-541d-42b3-8fa2-7f0394eec028","Type":"ContainerStarted","Data":"44f818ae709ede50f536e5e2f8e8ca5eab5c2d145e3ea6f46525df39d972569b"} Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.334019 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" event={"ID":"1bc8da59-541d-42b3-8fa2-7f0394eec028","Type":"ContainerStarted","Data":"ed874e7cb3879d40746cd3c460b3a23473ddc89ac4c4df405b6a9c1bc0fd4cdd"} Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.334896 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.335840 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.351207 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.351529 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4st4\" (UniqueName: \"kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4\") pod \"route-controller-manager-59484c89-r8jmz\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.357217 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" podStartSLOduration=2.357181284 podStartE2EDuration="2.357181284s" podCreationTimestamp="2025-12-06 15:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:26.353104538 +0000 UTC m=+273.651115461" watchObservedRunningTime="2025-12-06 15:33:26.357181284 +0000 UTC m=+273.655192207" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.500185 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.900425 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.973530 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d6d035-9129-459e-b913-95a87f868196" path="/var/lib/kubelet/pods/b1d6d035-9129-459e-b913-95a87f868196/volumes" Dec 06 15:33:26 crc kubenswrapper[4848]: I1206 15:33:26.974646 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5de67fc-ba65-4752-b10a-86149771384a" path="/var/lib/kubelet/pods/b5de67fc-ba65-4752-b10a-86149771384a/volumes" Dec 06 15:33:27 crc kubenswrapper[4848]: I1206 15:33:27.343002 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" event={"ID":"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a","Type":"ContainerStarted","Data":"92a0adbd814cfd3073e60613776f32a98c4df36bfc1682913cc2eb0ade57cdf9"} Dec 06 15:33:27 crc kubenswrapper[4848]: I1206 15:33:27.343049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" 
event={"ID":"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a","Type":"ContainerStarted","Data":"b9629e3dd1516c0b674c34d8e3c3156c75e8e73c11b8b8a84edef62bc7c36931"} Dec 06 15:33:27 crc kubenswrapper[4848]: I1206 15:33:27.362241 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" podStartSLOduration=3.36222518 podStartE2EDuration="3.36222518s" podCreationTimestamp="2025-12-06 15:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:27.360445622 +0000 UTC m=+274.658456545" watchObservedRunningTime="2025-12-06 15:33:27.36222518 +0000 UTC m=+274.660236093" Dec 06 15:33:28 crc kubenswrapper[4848]: I1206 15:33:28.347759 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:28 crc kubenswrapper[4848]: I1206 15:33:28.354233 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:44 crc kubenswrapper[4848]: I1206 15:33:44.942118 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:44 crc kubenswrapper[4848]: I1206 15:33:44.943231 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" podUID="1bc8da59-541d-42b3-8fa2-7f0394eec028" containerName="controller-manager" containerID="cri-o://44f818ae709ede50f536e5e2f8e8ca5eab5c2d145e3ea6f46525df39d972569b" gracePeriod=30 Dec 06 15:33:44 crc kubenswrapper[4848]: I1206 15:33:44.963946 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:44 crc 
kubenswrapper[4848]: I1206 15:33:44.964525 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" podUID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" containerName="route-controller-manager" containerID="cri-o://92a0adbd814cfd3073e60613776f32a98c4df36bfc1682913cc2eb0ade57cdf9" gracePeriod=30 Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.448747 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" containerID="92a0adbd814cfd3073e60613776f32a98c4df36bfc1682913cc2eb0ade57cdf9" exitCode=0 Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.448934 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" event={"ID":"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a","Type":"ContainerDied","Data":"92a0adbd814cfd3073e60613776f32a98c4df36bfc1682913cc2eb0ade57cdf9"} Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.449162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" event={"ID":"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a","Type":"ContainerDied","Data":"b9629e3dd1516c0b674c34d8e3c3156c75e8e73c11b8b8a84edef62bc7c36931"} Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.449188 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9629e3dd1516c0b674c34d8e3c3156c75e8e73c11b8b8a84edef62bc7c36931" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.450849 4848 generic.go:334] "Generic (PLEG): container finished" podID="1bc8da59-541d-42b3-8fa2-7f0394eec028" containerID="44f818ae709ede50f536e5e2f8e8ca5eab5c2d145e3ea6f46525df39d972569b" exitCode=0 Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.450906 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" event={"ID":"1bc8da59-541d-42b3-8fa2-7f0394eec028","Type":"ContainerDied","Data":"44f818ae709ede50f536e5e2f8e8ca5eab5c2d145e3ea6f46525df39d972569b"} Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.473130 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.477298 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550340 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca\") pod \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550421 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4lq\" (UniqueName: \"kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq\") pod \"1bc8da59-541d-42b3-8fa2-7f0394eec028\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550455 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert\") pod \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config\") pod 
\"1bc8da59-541d-42b3-8fa2-7f0394eec028\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550499 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert\") pod \"1bc8da59-541d-42b3-8fa2-7f0394eec028\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550538 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config\") pod \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles\") pod \"1bc8da59-541d-42b3-8fa2-7f0394eec028\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550595 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca\") pod \"1bc8da59-541d-42b3-8fa2-7f0394eec028\" (UID: \"1bc8da59-541d-42b3-8fa2-7f0394eec028\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.550623 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4st4\" (UniqueName: \"kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4\") pod \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\" (UID: \"9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a\") " Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.551277 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca" (OuterVolumeSpecName: "client-ca") pod "1bc8da59-541d-42b3-8fa2-7f0394eec028" (UID: "1bc8da59-541d-42b3-8fa2-7f0394eec028"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.551346 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config" (OuterVolumeSpecName: "config") pod "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" (UID: "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.551373 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config" (OuterVolumeSpecName: "config") pod "1bc8da59-541d-42b3-8fa2-7f0394eec028" (UID: "1bc8da59-541d-42b3-8fa2-7f0394eec028"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.552007 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" (UID: "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.552031 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1bc8da59-541d-42b3-8fa2-7f0394eec028" (UID: "1bc8da59-541d-42b3-8fa2-7f0394eec028"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.555369 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" (UID: "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.555389 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4" (OuterVolumeSpecName: "kube-api-access-g4st4") pod "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" (UID: "9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a"). InnerVolumeSpecName "kube-api-access-g4st4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.555558 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq" (OuterVolumeSpecName: "kube-api-access-8z4lq") pod "1bc8da59-541d-42b3-8fa2-7f0394eec028" (UID: "1bc8da59-541d-42b3-8fa2-7f0394eec028"). InnerVolumeSpecName "kube-api-access-8z4lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.555604 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bc8da59-541d-42b3-8fa2-7f0394eec028" (UID: "1bc8da59-541d-42b3-8fa2-7f0394eec028"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651192 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651495 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651505 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651515 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4st4\" (UniqueName: \"kubernetes.io/projected/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-kube-api-access-g4st4\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651525 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651534 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4lq\" (UniqueName: \"kubernetes.io/projected/1bc8da59-541d-42b3-8fa2-7f0394eec028-kube-api-access-8z4lq\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651542 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651550 4848 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc8da59-541d-42b3-8fa2-7f0394eec028-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:45 crc kubenswrapper[4848]: I1206 15:33:45.651558 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc8da59-541d-42b3-8fa2-7f0394eec028-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.187733 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:46 crc kubenswrapper[4848]: E1206 15:33:46.188072 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" containerName="route-controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.188087 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" containerName="route-controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: E1206 15:33:46.188136 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc8da59-541d-42b3-8fa2-7f0394eec028" containerName="controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.188145 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc8da59-541d-42b3-8fa2-7f0394eec028" containerName="controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.188321 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc8da59-541d-42b3-8fa2-7f0394eec028" containerName="controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.188335 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" containerName="route-controller-manager" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.188988 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.190885 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.191310 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.199909 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.204332 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257089 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqn22\" (UniqueName: \"kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257166 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257189 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8z5n\" (UniqueName: \"kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257212 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257303 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257362 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " 
pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257455 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.257527 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.358639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359332 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqn22\" (UniqueName: 
\"kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359361 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359401 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359436 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8z5n\" (UniqueName: \"kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359462 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc 
kubenswrapper[4848]: I1206 15:33:46.359489 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.359518 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.360397 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.360775 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.360852 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " 
pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.360996 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.361212 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.363301 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.363300 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert\") pod \"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.375339 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqn22\" (UniqueName: \"kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22\") pod 
\"controller-manager-75f89bf7fc-mclcq\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.378736 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8z5n\" (UniqueName: \"kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n\") pod \"route-controller-manager-c5d756cbc-67jrt\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.458238 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.458244 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-895789bdc-vlc5p" event={"ID":"1bc8da59-541d-42b3-8fa2-7f0394eec028","Type":"ContainerDied","Data":"ed874e7cb3879d40746cd3c460b3a23473ddc89ac4c4df405b6a9c1bc0fd4cdd"} Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.458303 4848 scope.go:117] "RemoveContainer" containerID="44f818ae709ede50f536e5e2f8e8ca5eab5c2d145e3ea6f46525df39d972569b" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.458259 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.489862 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.494582 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-895789bdc-vlc5p"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.499833 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.501812 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59484c89-r8jmz"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.508299 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.519239 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.886430 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.934989 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:46 crc kubenswrapper[4848]: W1206 15:33:46.946972 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0b800c_85d8_4904_b898_5f769c5733fb.slice/crio-0581260913a886bf363fdd451cb66d3abb08f11d970ce25a841bba38a948fbab WatchSource:0}: Error finding container 0581260913a886bf363fdd451cb66d3abb08f11d970ce25a841bba38a948fbab: Status 404 returned error can't find the container with id 0581260913a886bf363fdd451cb66d3abb08f11d970ce25a841bba38a948fbab Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.972116 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc8da59-541d-42b3-8fa2-7f0394eec028" path="/var/lib/kubelet/pods/1bc8da59-541d-42b3-8fa2-7f0394eec028/volumes" Dec 06 15:33:46 crc kubenswrapper[4848]: I1206 15:33:46.972720 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a" path="/var/lib/kubelet/pods/9f072ce6-da3e-402a-bc3a-e8bb2f6e3f9a/volumes" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.465007 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" event={"ID":"9f0b800c-85d8-4904-b898-5f769c5733fb","Type":"ContainerStarted","Data":"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8"} Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.465341 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" event={"ID":"9f0b800c-85d8-4904-b898-5f769c5733fb","Type":"ContainerStarted","Data":"0581260913a886bf363fdd451cb66d3abb08f11d970ce25a841bba38a948fbab"} Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.465373 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.469529 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" event={"ID":"3e96efda-0445-49b4-8769-a2cc961e2088","Type":"ContainerStarted","Data":"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed"} Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.469569 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" event={"ID":"3e96efda-0445-49b4-8769-a2cc961e2088","Type":"ContainerStarted","Data":"8c980a17abdadaef39a36b3f4a9c115184b170dab33c21842ab2798336b775c1"} Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.469747 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.476197 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.487253 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" podStartSLOduration=3.487236302 podStartE2EDuration="3.487236302s" podCreationTimestamp="2025-12-06 15:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-06 15:33:47.48641667 +0000 UTC m=+294.784427593" watchObservedRunningTime="2025-12-06 15:33:47.487236302 +0000 UTC m=+294.785247215" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.526893 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" podStartSLOduration=3.526871941 podStartE2EDuration="3.526871941s" podCreationTimestamp="2025-12-06 15:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:47.514999241 +0000 UTC m=+294.813010154" watchObservedRunningTime="2025-12-06 15:33:47.526871941 +0000 UTC m=+294.824882864" Dec 06 15:33:47 crc kubenswrapper[4848]: I1206 15:33:47.983400 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:48 crc kubenswrapper[4848]: I1206 15:33:48.105929 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:48 crc kubenswrapper[4848]: I1206 15:33:48.127588 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.479039 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" podUID="9f0b800c-85d8-4904-b898-5f769c5733fb" containerName="route-controller-manager" containerID="cri-o://f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8" gracePeriod=30 Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.479159 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" 
podUID="3e96efda-0445-49b4-8769-a2cc961e2088" containerName="controller-manager" containerID="cri-o://7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed" gracePeriod=30 Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.894998 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.918776 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:33:49 crc kubenswrapper[4848]: E1206 15:33:49.919013 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0b800c-85d8-4904-b898-5f769c5733fb" containerName="route-controller-manager" Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.919026 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0b800c-85d8-4904-b898-5f769c5733fb" containerName="route-controller-manager" Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.919163 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0b800c-85d8-4904-b898-5f769c5733fb" containerName="route-controller-manager" Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.919790 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.932562 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:33:49 crc kubenswrapper[4848]: I1206 15:33:49.961788 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001483 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca\") pod \"9f0b800c-85d8-4904-b898-5f769c5733fb\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001537 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert\") pod \"9f0b800c-85d8-4904-b898-5f769c5733fb\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert\") pod \"3e96efda-0445-49b4-8769-a2cc961e2088\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001613 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8z5n\" (UniqueName: \"kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n\") pod \"9f0b800c-85d8-4904-b898-5f769c5733fb\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001644 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config\") pod \"3e96efda-0445-49b4-8769-a2cc961e2088\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001669 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config\") pod \"9f0b800c-85d8-4904-b898-5f769c5733fb\" (UID: \"9f0b800c-85d8-4904-b898-5f769c5733fb\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001728 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles\") pod \"3e96efda-0445-49b4-8769-a2cc961e2088\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001748 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqn22\" (UniqueName: \"kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22\") pod \"3e96efda-0445-49b4-8769-a2cc961e2088\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001770 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca\") pod \"3e96efda-0445-49b4-8769-a2cc961e2088\" (UID: \"3e96efda-0445-49b4-8769-a2cc961e2088\") " Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001934 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001954 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config\") pod 
\"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001976 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.001998 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgl5r\" (UniqueName: \"kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.002676 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f0b800c-85d8-4904-b898-5f769c5733fb" (UID: "9f0b800c-85d8-4904-b898-5f769c5733fb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.003607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3e96efda-0445-49b4-8769-a2cc961e2088" (UID: "3e96efda-0445-49b4-8769-a2cc961e2088"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.008658 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f0b800c-85d8-4904-b898-5f769c5733fb" (UID: "9f0b800c-85d8-4904-b898-5f769c5733fb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.009790 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3e96efda-0445-49b4-8769-a2cc961e2088" (UID: "3e96efda-0445-49b4-8769-a2cc961e2088"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.013034 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n" (OuterVolumeSpecName: "kube-api-access-g8z5n") pod "9f0b800c-85d8-4904-b898-5f769c5733fb" (UID: "9f0b800c-85d8-4904-b898-5f769c5733fb"). InnerVolumeSpecName "kube-api-access-g8z5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.013089 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22" (OuterVolumeSpecName: "kube-api-access-rqn22") pod "3e96efda-0445-49b4-8769-a2cc961e2088" (UID: "3e96efda-0445-49b4-8769-a2cc961e2088"). InnerVolumeSpecName "kube-api-access-rqn22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.013563 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca" (OuterVolumeSpecName: "client-ca") pod "3e96efda-0445-49b4-8769-a2cc961e2088" (UID: "3e96efda-0445-49b4-8769-a2cc961e2088"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.013785 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config" (OuterVolumeSpecName: "config") pod "3e96efda-0445-49b4-8769-a2cc961e2088" (UID: "3e96efda-0445-49b4-8769-a2cc961e2088"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.014207 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config" (OuterVolumeSpecName: "config") pod "9f0b800c-85d8-4904-b898-5f769c5733fb" (UID: "9f0b800c-85d8-4904-b898-5f769c5733fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104174 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104769 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104811 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgl5r\" (UniqueName: \"kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104913 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8z5n\" (UniqueName: 
\"kubernetes.io/projected/9f0b800c-85d8-4904-b898-5f769c5733fb-kube-api-access-g8z5n\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104930 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104943 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104955 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104969 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqn22\" (UniqueName: \"kubernetes.io/projected/3e96efda-0445-49b4-8769-a2cc961e2088-kube-api-access-rqn22\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.104980 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e96efda-0445-49b4-8769-a2cc961e2088-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.105037 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0b800c-85d8-4904-b898-5f769c5733fb-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.105048 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0b800c-85d8-4904-b898-5f769c5733fb-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: 
I1206 15:33:50.105059 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e96efda-0445-49b4-8769-a2cc961e2088-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.106076 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.106238 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.107698 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.123103 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgl5r\" (UniqueName: \"kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r\") pod \"route-controller-manager-794fbfc59b-kfstn\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.260106 4848 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.485616 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f0b800c-85d8-4904-b898-5f769c5733fb" containerID="f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8" exitCode=0 Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.485670 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.485668 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" event={"ID":"9f0b800c-85d8-4904-b898-5f769c5733fb","Type":"ContainerDied","Data":"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8"} Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.486138 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt" event={"ID":"9f0b800c-85d8-4904-b898-5f769c5733fb","Type":"ContainerDied","Data":"0581260913a886bf363fdd451cb66d3abb08f11d970ce25a841bba38a948fbab"} Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.486214 4848 scope.go:117] "RemoveContainer" containerID="f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.487524 4848 generic.go:334] "Generic (PLEG): container finished" podID="3e96efda-0445-49b4-8769-a2cc961e2088" containerID="7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed" exitCode=0 Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.487545 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" 
event={"ID":"3e96efda-0445-49b4-8769-a2cc961e2088","Type":"ContainerDied","Data":"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed"} Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.487559 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" event={"ID":"3e96efda-0445-49b4-8769-a2cc961e2088","Type":"ContainerDied","Data":"8c980a17abdadaef39a36b3f4a9c115184b170dab33c21842ab2798336b775c1"} Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.487554 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mclcq" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.504206 4848 scope.go:117] "RemoveContainer" containerID="f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8" Dec 06 15:33:50 crc kubenswrapper[4848]: E1206 15:33:50.504676 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8\": container with ID starting with f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8 not found: ID does not exist" containerID="f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.504715 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8"} err="failed to get container status \"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8\": rpc error: code = NotFound desc = could not find container \"f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8\": container with ID starting with f183da3258ede8e30fe058ec43e761043fd4fc48c56f7ad3fb3e04ef92bad1d8 not found: ID does not exist" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 
15:33:50.504734 4848 scope.go:117] "RemoveContainer" containerID="7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.523809 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.525666 4848 scope.go:117] "RemoveContainer" containerID="7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed" Dec 06 15:33:50 crc kubenswrapper[4848]: E1206 15:33:50.530943 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed\": container with ID starting with 7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed not found: ID does not exist" containerID="7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.531182 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed"} err="failed to get container status \"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed\": rpc error: code = NotFound desc = could not find container \"7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed\": container with ID starting with 7e7cfa4a6523b2f01c33f58d2c580ff00e61c792a5af7f19b86f341ce11207ed not found: ID does not exist" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.532605 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-67jrt"] Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.536247 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:50 crc 
kubenswrapper[4848]: I1206 15:33:50.539215 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mclcq"] Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.627813 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.973348 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e96efda-0445-49b4-8769-a2cc961e2088" path="/var/lib/kubelet/pods/3e96efda-0445-49b4-8769-a2cc961e2088/volumes" Dec 06 15:33:50 crc kubenswrapper[4848]: I1206 15:33:50.973879 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0b800c-85d8-4904-b898-5f769c5733fb" path="/var/lib/kubelet/pods/9f0b800c-85d8-4904-b898-5f769c5733fb/volumes" Dec 06 15:33:51 crc kubenswrapper[4848]: I1206 15:33:51.497684 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" event={"ID":"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c","Type":"ContainerStarted","Data":"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1"} Dec 06 15:33:51 crc kubenswrapper[4848]: I1206 15:33:51.498033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" event={"ID":"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c","Type":"ContainerStarted","Data":"42af178ee157ff1636738dcd848dd40dd37eb86b9e90142a7c983e8a059b5e35"} Dec 06 15:33:51 crc kubenswrapper[4848]: I1206 15:33:51.498325 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:51 crc kubenswrapper[4848]: I1206 15:33:51.506631 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:33:51 crc kubenswrapper[4848]: I1206 15:33:51.516945 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" podStartSLOduration=3.516917502 podStartE2EDuration="3.516917502s" podCreationTimestamp="2025-12-06 15:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:51.51500757 +0000 UTC m=+298.813018553" watchObservedRunningTime="2025-12-06 15:33:51.516917502 +0000 UTC m=+298.814928465" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.197307 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:33:52 crc kubenswrapper[4848]: E1206 15:33:52.198104 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e96efda-0445-49b4-8769-a2cc961e2088" containerName="controller-manager" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.198149 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e96efda-0445-49b4-8769-a2cc961e2088" containerName="controller-manager" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.198413 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e96efda-0445-49b4-8769-a2cc961e2088" containerName="controller-manager" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.199314 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.202437 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.204808 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.204997 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.205136 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.208624 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.209656 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.210573 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.215415 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.233647 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " 
pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.233709 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.233743 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.233762 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2fnz\" (UniqueName: \"kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.233916 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.334262 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.334303 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.334331 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.334350 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2fnz\" (UniqueName: \"kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.334368 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 
15:33:52.335495 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.335960 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.336127 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.340969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.351415 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2fnz\" (UniqueName: \"kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz\") pod \"controller-manager-66f9447cbb-4lpjs\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " 
pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.520663 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:52 crc kubenswrapper[4848]: I1206 15:33:52.753057 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:33:52 crc kubenswrapper[4848]: W1206 15:33:52.763234 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d522f2a_31d4_4dbd_aa4d_d74c6dd958f0.slice/crio-57af629b83a507f7e48a05b5b1463b9ba05221b342670d448c5dd4cebda19fb8 WatchSource:0}: Error finding container 57af629b83a507f7e48a05b5b1463b9ba05221b342670d448c5dd4cebda19fb8: Status 404 returned error can't find the container with id 57af629b83a507f7e48a05b5b1463b9ba05221b342670d448c5dd4cebda19fb8 Dec 06 15:33:53 crc kubenswrapper[4848]: I1206 15:33:53.509334 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" event={"ID":"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0","Type":"ContainerStarted","Data":"bc54ea22bc6629e8969efd9b906344c1ab9dcef200e5983a0980adddf06c20b3"} Dec 06 15:33:53 crc kubenswrapper[4848]: I1206 15:33:53.509554 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" event={"ID":"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0","Type":"ContainerStarted","Data":"57af629b83a507f7e48a05b5b1463b9ba05221b342670d448c5dd4cebda19fb8"} Dec 06 15:33:53 crc kubenswrapper[4848]: I1206 15:33:53.528348 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" podStartSLOduration=5.528331565 podStartE2EDuration="5.528331565s" podCreationTimestamp="2025-12-06 
15:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:33:53.524886341 +0000 UTC m=+300.822897274" watchObservedRunningTime="2025-12-06 15:33:53.528331565 +0000 UTC m=+300.826342478" Dec 06 15:33:54 crc kubenswrapper[4848]: I1206 15:33:54.514445 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:33:54 crc kubenswrapper[4848]: I1206 15:33:54.520384 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:34:04 crc kubenswrapper[4848]: I1206 15:34:04.951198 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:34:04 crc kubenswrapper[4848]: I1206 15:34:04.952113 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" podUID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" containerName="controller-manager" containerID="cri-o://bc54ea22bc6629e8969efd9b906344c1ab9dcef200e5983a0980adddf06c20b3" gracePeriod=30 Dec 06 15:34:04 crc kubenswrapper[4848]: I1206 15:34:04.971440 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:34:04 crc kubenswrapper[4848]: I1206 15:34:04.971625 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" podUID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" containerName="route-controller-manager" containerID="cri-o://9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1" gracePeriod=30 Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.515613 4848 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.611481 4848 generic.go:334] "Generic (PLEG): container finished" podID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" containerID="bc54ea22bc6629e8969efd9b906344c1ab9dcef200e5983a0980adddf06c20b3" exitCode=0 Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.611568 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" event={"ID":"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0","Type":"ContainerDied","Data":"bc54ea22bc6629e8969efd9b906344c1ab9dcef200e5983a0980adddf06c20b3"} Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.612758 4848 generic.go:334] "Generic (PLEG): container finished" podID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" containerID="9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1" exitCode=0 Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.612808 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" event={"ID":"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c","Type":"ContainerDied","Data":"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1"} Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.612828 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" event={"ID":"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c","Type":"ContainerDied","Data":"42af178ee157ff1636738dcd848dd40dd37eb86b9e90142a7c983e8a059b5e35"} Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.612848 4848 scope.go:117] "RemoveContainer" containerID="9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.612969 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.628780 4848 scope.go:117] "RemoveContainer" containerID="9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1" Dec 06 15:34:05 crc kubenswrapper[4848]: E1206 15:34:05.629412 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1\": container with ID starting with 9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1 not found: ID does not exist" containerID="9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.629439 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1"} err="failed to get container status \"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1\": rpc error: code = NotFound desc = could not find container \"9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1\": container with ID starting with 9b89e5450c97666c353a4f75635ccce80c645601315cc35adf132a9d2278aea1 not found: ID does not exist" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.637573 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgl5r\" (UniqueName: \"kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r\") pod \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.637664 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config\") pod 
\"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.637756 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert\") pod \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.637809 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca\") pod \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\" (UID: \"d670fa7b-0bd4-48ea-ba7e-a8b796afb72c\") " Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.638823 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config" (OuterVolumeSpecName: "config") pod "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" (UID: "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.638839 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" (UID: "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.643874 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" (UID: "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.644707 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r" (OuterVolumeSpecName: "kube-api-access-kgl5r") pod "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" (UID: "d670fa7b-0bd4-48ea-ba7e-a8b796afb72c"). InnerVolumeSpecName "kube-api-access-kgl5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.739370 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.739431 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.739461 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.739488 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgl5r\" (UniqueName: \"kubernetes.io/projected/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c-kube-api-access-kgl5r\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.927115 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.936106 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:34:05 crc kubenswrapper[4848]: I1206 15:34:05.938811 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-794fbfc59b-kfstn"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.044667 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2fnz\" (UniqueName: \"kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz\") pod \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.044789 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config\") pod \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.044844 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles\") pod \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.044909 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca\") pod \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.045030 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert\") pod \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\" (UID: \"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0\") " Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.045614 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" (UID: "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.046019 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config" (OuterVolumeSpecName: "config") pod "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" (UID: "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.046084 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" (UID: "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.048409 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" (UID: "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.048563 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz" (OuterVolumeSpecName: "kube-api-access-c2fnz") pod "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" (UID: "7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0"). InnerVolumeSpecName "kube-api-access-c2fnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.145968 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.146010 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2fnz\" (UniqueName: \"kubernetes.io/projected/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-kube-api-access-c2fnz\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.146021 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.146031 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.146043 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.211229 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n"] Dec 06 15:34:06 crc kubenswrapper[4848]: E1206 15:34:06.212687 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" containerName="controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.212795 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" containerName="controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: E1206 15:34:06.212824 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" containerName="route-controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.212846 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" containerName="route-controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.213204 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" containerName="controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.213264 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" containerName="route-controller-manager" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.214598 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.216434 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.217618 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.219980 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.220020 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.220027 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.222295 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.222375 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.223378 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.227955 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.233405 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.246817 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c9396dc-f238-4d68-8306-5f22dc8abfcb-serving-cert\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: 
\"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.246860 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2bd50b-7099-4417-8986-0bbee1c1a413-serving-cert\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.246889 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-client-ca\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.246921 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-config\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.246942 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-client-ca\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.247026 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vhh\" (UniqueName: \"kubernetes.io/projected/0f2bd50b-7099-4417-8986-0bbee1c1a413-kube-api-access-q5vhh\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.247105 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-config\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.247144 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.247195 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lgh\" (UniqueName: \"kubernetes.io/projected/3c9396dc-f238-4d68-8306-5f22dc8abfcb-kube-api-access-c7lgh\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.347983 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lgh\" (UniqueName: \"kubernetes.io/projected/3c9396dc-f238-4d68-8306-5f22dc8abfcb-kube-api-access-c7lgh\") pod 
\"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348059 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c9396dc-f238-4d68-8306-5f22dc8abfcb-serving-cert\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2bd50b-7099-4417-8986-0bbee1c1a413-serving-cert\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348398 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-client-ca\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348461 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-config\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-client-ca\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348553 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5vhh\" (UniqueName: \"kubernetes.io/projected/0f2bd50b-7099-4417-8986-0bbee1c1a413-kube-api-access-q5vhh\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348607 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-config\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.348641 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.349515 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-client-ca\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc 
kubenswrapper[4848]: I1206 15:34:06.349839 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2bd50b-7099-4417-8986-0bbee1c1a413-config\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.350004 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-proxy-ca-bundles\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.350678 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-client-ca\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.350771 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c9396dc-f238-4d68-8306-5f22dc8abfcb-config\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.352902 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c9396dc-f238-4d68-8306-5f22dc8abfcb-serving-cert\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " 
pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.353472 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2bd50b-7099-4417-8986-0bbee1c1a413-serving-cert\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.363361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vhh\" (UniqueName: \"kubernetes.io/projected/0f2bd50b-7099-4417-8986-0bbee1c1a413-kube-api-access-q5vhh\") pod \"route-controller-manager-c5d756cbc-nbd5p\" (UID: \"0f2bd50b-7099-4417-8986-0bbee1c1a413\") " pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.373636 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lgh\" (UniqueName: \"kubernetes.io/projected/3c9396dc-f238-4d68-8306-5f22dc8abfcb-kube-api-access-c7lgh\") pod \"controller-manager-75f89bf7fc-mhz8n\" (UID: \"3c9396dc-f238-4d68-8306-5f22dc8abfcb\") " pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.534166 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.593921 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.621263 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" event={"ID":"7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0","Type":"ContainerDied","Data":"57af629b83a507f7e48a05b5b1463b9ba05221b342670d448c5dd4cebda19fb8"} Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.621373 4848 scope.go:117] "RemoveContainer" containerID="bc54ea22bc6629e8969efd9b906344c1ab9dcef200e5983a0980adddf06c20b3" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.621413 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f9447cbb-4lpjs" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.668488 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.674923 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66f9447cbb-4lpjs"] Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.981518 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0" path="/var/lib/kubelet/pods/7d522f2a-31d4-4dbd-aa4d-d74c6dd958f0/volumes" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.982497 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d670fa7b-0bd4-48ea-ba7e-a8b796afb72c" path="/var/lib/kubelet/pods/d670fa7b-0bd4-48ea-ba7e-a8b796afb72c/volumes" Dec 06 15:34:06 crc kubenswrapper[4848]: I1206 15:34:06.983118 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n"] Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.047587 4848 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l7h77"] Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.048227 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.072007 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l7h77"] Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.100651 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p"] Dec 06 15:34:07 crc kubenswrapper[4848]: W1206 15:34:07.135872 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2bd50b_7099_4417_8986_0bbee1c1a413.slice/crio-b91f4b4a0f4a0a3c8de7b733953b15931182d3f7ba42dfc8c9b2e89e7464606b WatchSource:0}: Error finding container b91f4b4a0f4a0a3c8de7b733953b15931182d3f7ba42dfc8c9b2e89e7464606b: Status 404 returned error can't find the container with id b91f4b4a0f4a0a3c8de7b733953b15931182d3f7ba42dfc8c9b2e89e7464606b Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173751 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-certificates\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173802 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpns7\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-kube-api-access-kpns7\") pod \"image-registry-66df7c8f76-l7h77\" (UID: 
\"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173828 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-trusted-ca\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173850 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-bound-sa-token\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173880 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b67cfe-d69c-47f7-9f75-151d362d5452-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.173909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-tls\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.174038 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.174065 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b67cfe-d69c-47f7-9f75-151d362d5452-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.201969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275363 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b67cfe-d69c-47f7-9f75-151d362d5452-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275429 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-tls\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275472 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b67cfe-d69c-47f7-9f75-151d362d5452-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275502 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-certificates\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275521 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpns7\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-kube-api-access-kpns7\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275541 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-trusted-ca\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.275562 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-bound-sa-token\") pod 
\"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.276001 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b67cfe-d69c-47f7-9f75-151d362d5452-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.277013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-trusted-ca\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.277060 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-certificates\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.280899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b67cfe-d69c-47f7-9f75-151d362d5452-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.291240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-registry-tls\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.295716 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpns7\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-kube-api-access-kpns7\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.299191 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b67cfe-d69c-47f7-9f75-151d362d5452-bound-sa-token\") pod \"image-registry-66df7c8f76-l7h77\" (UID: \"57b67cfe-d69c-47f7-9f75-151d362d5452\") " pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.429567 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.627533 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" event={"ID":"0f2bd50b-7099-4417-8986-0bbee1c1a413","Type":"ContainerStarted","Data":"75d873c3bb19adf5f413f90897bf7931c641228c0ebb300678a9bee45656e9c5"} Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.627847 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" event={"ID":"0f2bd50b-7099-4417-8986-0bbee1c1a413","Type":"ContainerStarted","Data":"b91f4b4a0f4a0a3c8de7b733953b15931182d3f7ba42dfc8c9b2e89e7464606b"} Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.628128 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.633846 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" event={"ID":"3c9396dc-f238-4d68-8306-5f22dc8abfcb","Type":"ContainerStarted","Data":"9b9a864f9131618802473a12de8e769577f578e8bf4a502529bfebe699e86f37"} Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.633887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" event={"ID":"3c9396dc-f238-4d68-8306-5f22dc8abfcb","Type":"ContainerStarted","Data":"6e374ed278bfdd8c6a635dc861dd94efdcc2b6765bdf66d410aa896fe3868578"} Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.634633 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.639278 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.650484 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" podStartSLOduration=2.650467312 podStartE2EDuration="2.650467312s" podCreationTimestamp="2025-12-06 15:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:34:07.648620852 +0000 UTC m=+314.946631765" watchObservedRunningTime="2025-12-06 15:34:07.650467312 +0000 UTC m=+314.948478215" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.675635 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75f89bf7fc-mhz8n" podStartSLOduration=3.67561668 podStartE2EDuration="3.67561668s" podCreationTimestamp="2025-12-06 15:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:34:07.6748757 +0000 UTC m=+314.972886643" watchObservedRunningTime="2025-12-06 15:34:07.67561668 +0000 UTC m=+314.973627593" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.779574 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c5d756cbc-nbd5p" Dec 06 15:34:07 crc kubenswrapper[4848]: I1206 15:34:07.884259 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l7h77"] Dec 06 15:34:07 crc kubenswrapper[4848]: W1206 15:34:07.889231 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b67cfe_d69c_47f7_9f75_151d362d5452.slice/crio-e842bdab36345641dcdb323eccdba6db73ce640f548c4aabbd9a7d618970f4e3 WatchSource:0}: Error finding container e842bdab36345641dcdb323eccdba6db73ce640f548c4aabbd9a7d618970f4e3: Status 404 returned error can't find the container with id e842bdab36345641dcdb323eccdba6db73ce640f548c4aabbd9a7d618970f4e3 Dec 06 15:34:08 crc kubenswrapper[4848]: I1206 15:34:08.640191 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" event={"ID":"57b67cfe-d69c-47f7-9f75-151d362d5452","Type":"ContainerStarted","Data":"6e1831fe9fb960593470aa32e983c3802d039480a34edf81d4c05645e9b528ed"} Dec 06 15:34:08 crc kubenswrapper[4848]: I1206 15:34:08.640369 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" event={"ID":"57b67cfe-d69c-47f7-9f75-151d362d5452","Type":"ContainerStarted","Data":"e842bdab36345641dcdb323eccdba6db73ce640f548c4aabbd9a7d618970f4e3"} Dec 06 15:34:08 crc kubenswrapper[4848]: I1206 15:34:08.659941 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" podStartSLOduration=1.659925959 podStartE2EDuration="1.659925959s" podCreationTimestamp="2025-12-06 15:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:34:08.656180268 +0000 UTC m=+315.954191181" watchObservedRunningTime="2025-12-06 15:34:08.659925959 +0000 UTC m=+315.957936872" Dec 06 15:34:09 crc kubenswrapper[4848]: I1206 15:34:09.645107 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.679113 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.679943 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fgcg8" podUID="c93d43de-aa47-4357-b786-aa586a35d462" containerName="registry-server" containerID="cri-o://1c733c7a7c28a715d0e3df0cfdcf9d42aff38981a19129f6d63437f58e201132" gracePeriod=30 Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.698129 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.698529 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lgzqt" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="registry-server" containerID="cri-o://39c0cf19568dd424cbba8ba20145e648cc341c996340c8e82536a493de414236" gracePeriod=30 Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.703832 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.707891 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" containerID="cri-o://e307124f47fd7190b563ddefe4afea84fe09bd8ae53bc2e0e4e7095c3b82acab" gracePeriod=30 Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.720571 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.720890 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bdjbm" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="registry-server" 
containerID="cri-o://c770863ec9677c862486bafb90a6d1fe6c0e3220967a74539a731c86abdb6c1f" gracePeriod=30 Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.731885 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zgk8b"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.732626 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.741400 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.741628 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmm2g" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="registry-server" containerID="cri-o://32d8f945220f7c927c972df3dc87b0e9cc2b757235e3a12596ef4a919599e594" gracePeriod=30 Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.748295 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zgk8b"] Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.775615 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.775680 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvrhg\" (UniqueName: \"kubernetes.io/projected/6c0af646-ef02-4709-9e31-cd29cd07fa4a-kube-api-access-fvrhg\") pod 
\"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.775881 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.878200 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.878284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.878329 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvrhg\" (UniqueName: \"kubernetes.io/projected/6c0af646-ef02-4709-9e31-cd29cd07fa4a-kube-api-access-fvrhg\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.881121 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.886579 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c0af646-ef02-4709-9e31-cd29cd07fa4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:14 crc kubenswrapper[4848]: I1206 15:34:14.905509 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvrhg\" (UniqueName: \"kubernetes.io/projected/6c0af646-ef02-4709-9e31-cd29cd07fa4a-kube-api-access-fvrhg\") pod \"marketplace-operator-79b997595-zgk8b\" (UID: \"6c0af646-ef02-4709-9e31-cd29cd07fa4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.052093 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.477610 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zgk8b"] Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.677385 4848 generic.go:334] "Generic (PLEG): container finished" podID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerID="e307124f47fd7190b563ddefe4afea84fe09bd8ae53bc2e0e4e7095c3b82acab" exitCode=0 Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.677472 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerDied","Data":"e307124f47fd7190b563ddefe4afea84fe09bd8ae53bc2e0e4e7095c3b82acab"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.677898 4848 scope.go:117] "RemoveContainer" containerID="4076a09d59ac0d5d7055a84db2c7974c7145f95532fbe10b74659660bf761617" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.680837 4848 generic.go:334] "Generic (PLEG): container finished" podID="2913f1de-4117-470b-b1de-a876051a131c" containerID="32d8f945220f7c927c972df3dc87b0e9cc2b757235e3a12596ef4a919599e594" exitCode=0 Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.680907 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerDied","Data":"32d8f945220f7c927c972df3dc87b0e9cc2b757235e3a12596ef4a919599e594"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.685108 4848 generic.go:334] "Generic (PLEG): container finished" podID="c93d43de-aa47-4357-b786-aa586a35d462" containerID="1c733c7a7c28a715d0e3df0cfdcf9d42aff38981a19129f6d63437f58e201132" exitCode=0 Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.685168 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-fgcg8" event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerDied","Data":"1c733c7a7c28a715d0e3df0cfdcf9d42aff38981a19129f6d63437f58e201132"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.692054 4848 generic.go:334] "Generic (PLEG): container finished" podID="7029868d-509f-479f-a237-45715e8114e2" containerID="c770863ec9677c862486bafb90a6d1fe6c0e3220967a74539a731c86abdb6c1f" exitCode=0 Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.692179 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerDied","Data":"c770863ec9677c862486bafb90a6d1fe6c0e3220967a74539a731c86abdb6c1f"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.695123 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" event={"ID":"6c0af646-ef02-4709-9e31-cd29cd07fa4a","Type":"ContainerStarted","Data":"844cc8d4653c6ad52270d40e2087cc00657f2a800a92cd89e76b5adbe6eb92aa"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.695167 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" event={"ID":"6c0af646-ef02-4709-9e31-cd29cd07fa4a","Type":"ContainerStarted","Data":"99db8bc6fdbeb4836fd5e6a0da44d08b5fd8fa18e329724702266dd82a26bc4d"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.696630 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.698952 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zgk8b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.67:8080/healthz\": dial tcp 10.217.0.67:8080: connect: connection refused" 
start-of-body= Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.699009 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" podUID="6c0af646-ef02-4709-9e31-cd29cd07fa4a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.67:8080/healthz\": dial tcp 10.217.0.67:8080: connect: connection refused" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.701361 4848 generic.go:334] "Generic (PLEG): container finished" podID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerID="39c0cf19568dd424cbba8ba20145e648cc341c996340c8e82536a493de414236" exitCode=0 Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.701406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerDied","Data":"39c0cf19568dd424cbba8ba20145e648cc341c996340c8e82536a493de414236"} Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.720059 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" podStartSLOduration=1.720039866 podStartE2EDuration="1.720039866s" podCreationTimestamp="2025-12-06 15:34:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:34:15.719203494 +0000 UTC m=+323.017214407" watchObservedRunningTime="2025-12-06 15:34:15.720039866 +0000 UTC m=+323.018050779" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.728670 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.795141 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities\") pod \"c93d43de-aa47-4357-b786-aa586a35d462\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.795200 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tfq5\" (UniqueName: \"kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5\") pod \"c93d43de-aa47-4357-b786-aa586a35d462\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.795231 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content\") pod \"c93d43de-aa47-4357-b786-aa586a35d462\" (UID: \"c93d43de-aa47-4357-b786-aa586a35d462\") " Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.797093 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities" (OuterVolumeSpecName: "utilities") pod "c93d43de-aa47-4357-b786-aa586a35d462" (UID: "c93d43de-aa47-4357-b786-aa586a35d462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.806454 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5" (OuterVolumeSpecName: "kube-api-access-2tfq5") pod "c93d43de-aa47-4357-b786-aa586a35d462" (UID: "c93d43de-aa47-4357-b786-aa586a35d462"). InnerVolumeSpecName "kube-api-access-2tfq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.851204 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c93d43de-aa47-4357-b786-aa586a35d462" (UID: "c93d43de-aa47-4357-b786-aa586a35d462"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.896364 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.896396 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tfq5\" (UniqueName: \"kubernetes.io/projected/c93d43de-aa47-4357-b786-aa586a35d462-kube-api-access-2tfq5\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:15 crc kubenswrapper[4848]: I1206 15:34:15.896406 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93d43de-aa47-4357-b786-aa586a35d462-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.033825 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.080179 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.086554 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.100190 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities\") pod \"2913f1de-4117-470b-b1de-a876051a131c\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.100247 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content\") pod \"2913f1de-4117-470b-b1de-a876051a131c\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.100299 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc6vn\" (UniqueName: \"kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn\") pod \"2913f1de-4117-470b-b1de-a876051a131c\" (UID: \"2913f1de-4117-470b-b1de-a876051a131c\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.101854 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities" (OuterVolumeSpecName: "utilities") pod "2913f1de-4117-470b-b1de-a876051a131c" (UID: "2913f1de-4117-470b-b1de-a876051a131c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.105585 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn" (OuterVolumeSpecName: "kube-api-access-bc6vn") pod "2913f1de-4117-470b-b1de-a876051a131c" (UID: "2913f1de-4117-470b-b1de-a876051a131c"). InnerVolumeSpecName "kube-api-access-bc6vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.108804 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201302 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics\") pod \"6056241e-bb6d-420b-9808-b9b3803a3c2d\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201359 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities\") pod \"7029868d-509f-479f-a237-45715e8114e2\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201400 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kndnf\" (UniqueName: \"kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf\") pod \"6056241e-bb6d-420b-9808-b9b3803a3c2d\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201421 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znfpr\" (UniqueName: \"kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr\") pod \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201441 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities\") pod 
\"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201469 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txw96\" (UniqueName: \"kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96\") pod \"7029868d-509f-479f-a237-45715e8114e2\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201502 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content\") pod \"7029868d-509f-479f-a237-45715e8114e2\" (UID: \"7029868d-509f-479f-a237-45715e8114e2\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201541 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content\") pod \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\" (UID: \"9aba59c4-d9e7-444b-9620-29a26fa4c9fb\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201586 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca\") pod \"6056241e-bb6d-420b-9808-b9b3803a3c2d\" (UID: \"6056241e-bb6d-420b-9808-b9b3803a3c2d\") " Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201825 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc6vn\" (UniqueName: \"kubernetes.io/projected/2913f1de-4117-470b-b1de-a876051a131c-kube-api-access-bc6vn\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.201838 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.202409 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6056241e-bb6d-420b-9808-b9b3803a3c2d" (UID: "6056241e-bb6d-420b-9808-b9b3803a3c2d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.203205 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities" (OuterVolumeSpecName: "utilities") pod "7029868d-509f-479f-a237-45715e8114e2" (UID: "7029868d-509f-479f-a237-45715e8114e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.203411 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities" (OuterVolumeSpecName: "utilities") pod "9aba59c4-d9e7-444b-9620-29a26fa4c9fb" (UID: "9aba59c4-d9e7-444b-9620-29a26fa4c9fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.205870 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96" (OuterVolumeSpecName: "kube-api-access-txw96") pod "7029868d-509f-479f-a237-45715e8114e2" (UID: "7029868d-509f-479f-a237-45715e8114e2"). InnerVolumeSpecName "kube-api-access-txw96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.208828 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr" (OuterVolumeSpecName: "kube-api-access-znfpr") pod "9aba59c4-d9e7-444b-9620-29a26fa4c9fb" (UID: "9aba59c4-d9e7-444b-9620-29a26fa4c9fb"). InnerVolumeSpecName "kube-api-access-znfpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.209125 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf" (OuterVolumeSpecName: "kube-api-access-kndnf") pod "6056241e-bb6d-420b-9808-b9b3803a3c2d" (UID: "6056241e-bb6d-420b-9808-b9b3803a3c2d"). InnerVolumeSpecName "kube-api-access-kndnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.210542 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6056241e-bb6d-420b-9808-b9b3803a3c2d" (UID: "6056241e-bb6d-420b-9808-b9b3803a3c2d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.223070 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7029868d-509f-479f-a237-45715e8114e2" (UID: "7029868d-509f-479f-a237-45715e8114e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.237477 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2913f1de-4117-470b-b1de-a876051a131c" (UID: "2913f1de-4117-470b-b1de-a876051a131c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.260678 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aba59c4-d9e7-444b-9620-29a26fa4c9fb" (UID: "9aba59c4-d9e7-444b-9620-29a26fa4c9fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305030 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305445 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305576 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305653 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2913f1de-4117-470b-b1de-a876051a131c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305751 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6056241e-bb6d-420b-9808-b9b3803a3c2d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305845 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7029868d-509f-479f-a237-45715e8114e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.305939 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kndnf\" (UniqueName: \"kubernetes.io/projected/6056241e-bb6d-420b-9808-b9b3803a3c2d-kube-api-access-kndnf\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.306032 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znfpr\" (UniqueName: \"kubernetes.io/projected/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-kube-api-access-znfpr\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.306101 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aba59c4-d9e7-444b-9620-29a26fa4c9fb-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.306167 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txw96\" (UniqueName: \"kubernetes.io/projected/7029868d-509f-479f-a237-45715e8114e2-kube-api-access-txw96\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.710713 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lgzqt" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.710689 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lgzqt" event={"ID":"9aba59c4-d9e7-444b-9620-29a26fa4c9fb","Type":"ContainerDied","Data":"87c6b6575b1e56cbf4852eeb17e1b3f7c67e2e55207fc5df0d38f9a76096cd50"} Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.710894 4848 scope.go:117] "RemoveContainer" containerID="39c0cf19568dd424cbba8ba20145e648cc341c996340c8e82536a493de414236" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.712332 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.712335 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5gkxw" event={"ID":"6056241e-bb6d-420b-9808-b9b3803a3c2d","Type":"ContainerDied","Data":"00f0b10bc96b73fed58068c9e54ab52c94d55b507b80c65a536df9fcd76f99bf"} Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.716755 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmm2g" event={"ID":"2913f1de-4117-470b-b1de-a876051a131c","Type":"ContainerDied","Data":"e3b76a0ce93fe699afca1077da7a34862443d46875c1b4aeb10bc3777ea36109"} Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.716876 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmm2g" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.719939 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgcg8" event={"ID":"c93d43de-aa47-4357-b786-aa586a35d462","Type":"ContainerDied","Data":"acb4de3c99a6a1dc885b8c531f6b21d7c0c30edbb3bb84a454c566c28eb58f07"} Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.720105 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgcg8" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.723202 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdjbm" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.723836 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdjbm" event={"ID":"7029868d-509f-479f-a237-45715e8114e2","Type":"ContainerDied","Data":"66fa250d2c7617d27b44d5f9743ceabdff769d89768e851dc40c15b298a988ff"} Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.729618 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zgk8b" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.732478 4848 scope.go:117] "RemoveContainer" containerID="8b410cb77833909364fb2f514e54a5b2582773a7defd5698288742f6b8eadc05" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.766901 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.769547 4848 scope.go:117] "RemoveContainer" containerID="8cd6ee99a2c9aa7dabea46df0fa20b5d59acdafb4fa3555d1e9aa1f42913fcc8" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.775277 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-lgzqt"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.785618 4848 scope.go:117] "RemoveContainer" containerID="e307124f47fd7190b563ddefe4afea84fe09bd8ae53bc2e0e4e7095c3b82acab" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.791788 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.796161 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5gkxw"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.801341 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.811055 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fgcg8"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.820612 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.827669 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmm2g"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.827829 4848 scope.go:117] "RemoveContainer" containerID="32d8f945220f7c927c972df3dc87b0e9cc2b757235e3a12596ef4a919599e594" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.829560 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.833924 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdjbm"] Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.839640 4848 scope.go:117] "RemoveContainer" containerID="58844abeee291e13fc7a6a929a8f384a223ab709325de815c5d9112958469a72" Dec 06 
15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.853008 4848 scope.go:117] "RemoveContainer" containerID="01921c4e8b70059923fb7ad88000a9a5aa060169049d0fbe1788beec84574e4d" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.874877 4848 scope.go:117] "RemoveContainer" containerID="1c733c7a7c28a715d0e3df0cfdcf9d42aff38981a19129f6d63437f58e201132" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.894997 4848 scope.go:117] "RemoveContainer" containerID="1e410c4c10617d38176c56ecf0bd3a2d9c329c6a750d932eb5eb157c1fabcdd0" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.913487 4848 scope.go:117] "RemoveContainer" containerID="b919732718c6d723a61617006601c8be1d483479c4cee3de766a99408297713f" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.934866 4848 scope.go:117] "RemoveContainer" containerID="c770863ec9677c862486bafb90a6d1fe6c0e3220967a74539a731c86abdb6c1f" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.948671 4848 scope.go:117] "RemoveContainer" containerID="3a5ef5830117b70533447764d61b0371883b101b3eacfc5d9dee316d3f859df6" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.961085 4848 scope.go:117] "RemoveContainer" containerID="15a125363743de990e503b8665b93724b7c9158497614a869ff54a9c260489d0" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.971020 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2913f1de-4117-470b-b1de-a876051a131c" path="/var/lib/kubelet/pods/2913f1de-4117-470b-b1de-a876051a131c/volumes" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.971625 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" path="/var/lib/kubelet/pods/6056241e-bb6d-420b-9808-b9b3803a3c2d/volumes" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.972096 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7029868d-509f-479f-a237-45715e8114e2" path="/var/lib/kubelet/pods/7029868d-509f-479f-a237-45715e8114e2/volumes" Dec 
06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.973009 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" path="/var/lib/kubelet/pods/9aba59c4-d9e7-444b-9620-29a26fa4c9fb/volumes" Dec 06 15:34:16 crc kubenswrapper[4848]: I1206 15:34:16.973559 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93d43de-aa47-4357-b786-aa586a35d462" path="/var/lib/kubelet/pods/c93d43de-aa47-4357-b786-aa586a35d462/volumes" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.153382 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.153481 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.953937 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k889q"] Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954481 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954498 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954507 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93d43de-aa47-4357-b786-aa586a35d462" 
containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954516 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93d43de-aa47-4357-b786-aa586a35d462" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954530 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954537 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954548 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954554 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954567 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954574 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954588 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954596 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954645 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93d43de-aa47-4357-b786-aa586a35d462" 
containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954654 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93d43de-aa47-4357-b786-aa586a35d462" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954661 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954669 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954679 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954686 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954710 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954718 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954728 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954735 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954745 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93d43de-aa47-4357-b786-aa586a35d462" 
containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954752 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93d43de-aa47-4357-b786-aa586a35d462" containerName="extract-utilities" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.954761 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954771 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="extract-content" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954893 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aba59c4-d9e7-444b-9620-29a26fa4c9fb" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954909 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954919 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93d43de-aa47-4357-b786-aa586a35d462" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954927 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.954989 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7029868d-509f-479f-a237-45715e8114e2" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.955003 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2913f1de-4117-470b-b1de-a876051a131c" containerName="registry-server" Dec 06 15:34:17 crc kubenswrapper[4848]: E1206 15:34:17.955179 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.955193 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6056241e-bb6d-420b-9808-b9b3803a3c2d" containerName="marketplace-operator" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.967694 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.969636 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 15:34:17 crc kubenswrapper[4848]: I1206 15:34:17.974476 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k889q"] Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.027844 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-utilities\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.028114 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-catalog-content\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.028264 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qch65\" (UniqueName: \"kubernetes.io/projected/a35496c1-709b-4b72-8f68-c72b29e955ea-kube-api-access-qch65\") pod \"certified-operators-k889q\" (UID: 
\"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.129026 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-utilities\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.129076 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-catalog-content\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.129108 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qch65\" (UniqueName: \"kubernetes.io/projected/a35496c1-709b-4b72-8f68-c72b29e955ea-kube-api-access-qch65\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.129932 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-utilities\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.130141 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a35496c1-709b-4b72-8f68-c72b29e955ea-catalog-content\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") 
" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.159689 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qch65\" (UniqueName: \"kubernetes.io/projected/a35496c1-709b-4b72-8f68-c72b29e955ea-kube-api-access-qch65\") pod \"certified-operators-k889q\" (UID: \"a35496c1-709b-4b72-8f68-c72b29e955ea\") " pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.292728 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.552391 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t7xkb"] Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.553861 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.606269 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.609948 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7xkb"] Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.634869 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-utilities\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.634912 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phpl2\" (UniqueName: 
\"kubernetes.io/projected/11513b70-de83-43bd-a70f-fdcc3a6aa1da-kube-api-access-phpl2\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.634962 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-catalog-content\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.662790 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k889q"] Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.735618 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-catalog-content\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.735684 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-utilities\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.735738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phpl2\" (UniqueName: \"kubernetes.io/projected/11513b70-de83-43bd-a70f-fdcc3a6aa1da-kube-api-access-phpl2\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 
06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.736164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-catalog-content\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.736178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11513b70-de83-43bd-a70f-fdcc3a6aa1da-utilities\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.739362 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k889q" event={"ID":"a35496c1-709b-4b72-8f68-c72b29e955ea","Type":"ContainerStarted","Data":"ce01ad9ce7de877c5d245b961844793adf72ed19fbb1f9ee03a8712902cf7672"} Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.754251 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phpl2\" (UniqueName: \"kubernetes.io/projected/11513b70-de83-43bd-a70f-fdcc3a6aa1da-kube-api-access-phpl2\") pod \"redhat-marketplace-t7xkb\" (UID: \"11513b70-de83-43bd-a70f-fdcc3a6aa1da\") " pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:18 crc kubenswrapper[4848]: I1206 15:34:18.922225 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 15:34:19.282279 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7xkb"] Dec 06 15:34:19 crc kubenswrapper[4848]: W1206 15:34:19.295395 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11513b70_de83_43bd_a70f_fdcc3a6aa1da.slice/crio-5925f255f00b214003e5fc9387ee60115229a69a7b47ab0db260262749229ee2 WatchSource:0}: Error finding container 5925f255f00b214003e5fc9387ee60115229a69a7b47ab0db260262749229ee2: Status 404 returned error can't find the container with id 5925f255f00b214003e5fc9387ee60115229a69a7b47ab0db260262749229ee2 Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 15:34:19.745890 4848 generic.go:334] "Generic (PLEG): container finished" podID="11513b70-de83-43bd-a70f-fdcc3a6aa1da" containerID="5e27eea0518c87a5705523d89f918e09692d59725e5ee0b919b8786881b18778" exitCode=0 Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 15:34:19.745977 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7xkb" event={"ID":"11513b70-de83-43bd-a70f-fdcc3a6aa1da","Type":"ContainerDied","Data":"5e27eea0518c87a5705523d89f918e09692d59725e5ee0b919b8786881b18778"} Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 15:34:19.746009 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7xkb" event={"ID":"11513b70-de83-43bd-a70f-fdcc3a6aa1da","Type":"ContainerStarted","Data":"5925f255f00b214003e5fc9387ee60115229a69a7b47ab0db260262749229ee2"} Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 15:34:19.747389 4848 generic.go:334] "Generic (PLEG): container finished" podID="a35496c1-709b-4b72-8f68-c72b29e955ea" containerID="8414b479828c840aeb1d08038cc12a90a2dd97c06097bbac8845186e3ab3162e" exitCode=0 Dec 06 15:34:19 crc kubenswrapper[4848]: I1206 
15:34:19.747429 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k889q" event={"ID":"a35496c1-709b-4b72-8f68-c72b29e955ea","Type":"ContainerDied","Data":"8414b479828c840aeb1d08038cc12a90a2dd97c06097bbac8845186e3ab3162e"} Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.352486 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47f72"] Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.353595 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.357028 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.362169 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47f72"] Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.469657 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-utilities\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.469936 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nfdt\" (UniqueName: \"kubernetes.io/projected/95984d31-57cc-4d15-b0da-208b3bba0cfc-kube-api-access-6nfdt\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.470033 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-catalog-content\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.571244 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfdt\" (UniqueName: \"kubernetes.io/projected/95984d31-57cc-4d15-b0da-208b3bba0cfc-kube-api-access-6nfdt\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.571471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-catalog-content\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.572251 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-catalog-content\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.572538 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-utilities\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.572858 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/95984d31-57cc-4d15-b0da-208b3bba0cfc-utilities\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.591107 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfdt\" (UniqueName: \"kubernetes.io/projected/95984d31-57cc-4d15-b0da-208b3bba0cfc-kube-api-access-6nfdt\") pod \"redhat-operators-47f72\" (UID: \"95984d31-57cc-4d15-b0da-208b3bba0cfc\") " pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.683346 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.769093 4848 generic.go:334] "Generic (PLEG): container finished" podID="11513b70-de83-43bd-a70f-fdcc3a6aa1da" containerID="66332222c2186d5b046db46ec0b733f2b04b47cd15d525c523b8c2f2587ce08f" exitCode=0 Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.769191 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7xkb" event={"ID":"11513b70-de83-43bd-a70f-fdcc3a6aa1da","Type":"ContainerDied","Data":"66332222c2186d5b046db46ec0b733f2b04b47cd15d525c523b8c2f2587ce08f"} Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.820186 4848 generic.go:334] "Generic (PLEG): container finished" podID="a35496c1-709b-4b72-8f68-c72b29e955ea" containerID="b1ae3c834218dfc27484658fec016dd49808448e78be92c1b811ed9d439a8375" exitCode=0 Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.820247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k889q" event={"ID":"a35496c1-709b-4b72-8f68-c72b29e955ea","Type":"ContainerDied","Data":"b1ae3c834218dfc27484658fec016dd49808448e78be92c1b811ed9d439a8375"} Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 
15:34:20.957858 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dvh52"] Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.959224 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.960878 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dvh52"] Dec 06 15:34:20 crc kubenswrapper[4848]: I1206 15:34:20.960971 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.079377 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-catalog-content\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.079430 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-utilities\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.079464 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp6mv\" (UniqueName: \"kubernetes.io/projected/5245db24-208d-47cf-9d64-d62b203292e2-kube-api-access-sp6mv\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.118783 4848 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47f72"] Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.181082 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-catalog-content\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.181135 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-utilities\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.181187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp6mv\" (UniqueName: \"kubernetes.io/projected/5245db24-208d-47cf-9d64-d62b203292e2-kube-api-access-sp6mv\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.182012 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-utilities\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.182056 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5245db24-208d-47cf-9d64-d62b203292e2-catalog-content\") pod \"community-operators-dvh52\" (UID: 
\"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.201848 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp6mv\" (UniqueName: \"kubernetes.io/projected/5245db24-208d-47cf-9d64-d62b203292e2-kube-api-access-sp6mv\") pod \"community-operators-dvh52\" (UID: \"5245db24-208d-47cf-9d64-d62b203292e2\") " pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.281002 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.677522 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dvh52"] Dec 06 15:34:21 crc kubenswrapper[4848]: W1206 15:34:21.683487 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5245db24_208d_47cf_9d64_d62b203292e2.slice/crio-cf9d5cca81f00f4230605eaa77864332c9a72632d70d48a55d6b4d1d22a5b03b WatchSource:0}: Error finding container cf9d5cca81f00f4230605eaa77864332c9a72632d70d48a55d6b4d1d22a5b03b: Status 404 returned error can't find the container with id cf9d5cca81f00f4230605eaa77864332c9a72632d70d48a55d6b4d1d22a5b03b Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.828486 4848 generic.go:334] "Generic (PLEG): container finished" podID="95984d31-57cc-4d15-b0da-208b3bba0cfc" containerID="52e96102acb60688b9491ef04c831fa7000e760f52bbea29a001ee0dc25b47b5" exitCode=0 Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.828727 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47f72" event={"ID":"95984d31-57cc-4d15-b0da-208b3bba0cfc","Type":"ContainerDied","Data":"52e96102acb60688b9491ef04c831fa7000e760f52bbea29a001ee0dc25b47b5"} Dec 06 15:34:21 crc 
kubenswrapper[4848]: I1206 15:34:21.828857 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47f72" event={"ID":"95984d31-57cc-4d15-b0da-208b3bba0cfc","Type":"ContainerStarted","Data":"a1c9544be31bcb837638fe32f3b771dd813c139aa91ab44c143fc1e619a653e7"} Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.834981 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7xkb" event={"ID":"11513b70-de83-43bd-a70f-fdcc3a6aa1da","Type":"ContainerStarted","Data":"2f33c68fb7a237f3f65fc466f25c7b9a0d188161dff91f86a7ec5ceb09d34bf7"} Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.855309 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k889q" event={"ID":"a35496c1-709b-4b72-8f68-c72b29e955ea","Type":"ContainerStarted","Data":"02278a45b4fa6919c5d619b0ac13897a4f9aba06d2392bbc792e0bbf1b57ab18"} Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.859200 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvh52" event={"ID":"5245db24-208d-47cf-9d64-d62b203292e2","Type":"ContainerStarted","Data":"ce52e893d99192f52bb0da937573319382844bd8ebf361cf60be121a3f52accb"} Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.859243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvh52" event={"ID":"5245db24-208d-47cf-9d64-d62b203292e2","Type":"ContainerStarted","Data":"cf9d5cca81f00f4230605eaa77864332c9a72632d70d48a55d6b4d1d22a5b03b"} Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.885138 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t7xkb" podStartSLOduration=2.440059561 podStartE2EDuration="3.885118482s" podCreationTimestamp="2025-12-06 15:34:18 +0000 UTC" firstStartedPulling="2025-12-06 15:34:19.747902668 +0000 UTC m=+327.045913581" 
lastFinishedPulling="2025-12-06 15:34:21.192961589 +0000 UTC m=+328.490972502" observedRunningTime="2025-12-06 15:34:21.866620501 +0000 UTC m=+329.164631424" watchObservedRunningTime="2025-12-06 15:34:21.885118482 +0000 UTC m=+329.183129395" Dec 06 15:34:21 crc kubenswrapper[4848]: I1206 15:34:21.902417 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k889q" podStartSLOduration=3.409304429 podStartE2EDuration="4.90239602s" podCreationTimestamp="2025-12-06 15:34:17 +0000 UTC" firstStartedPulling="2025-12-06 15:34:19.748491094 +0000 UTC m=+327.046502007" lastFinishedPulling="2025-12-06 15:34:21.241582695 +0000 UTC m=+328.539593598" observedRunningTime="2025-12-06 15:34:21.898023121 +0000 UTC m=+329.196034044" watchObservedRunningTime="2025-12-06 15:34:21.90239602 +0000 UTC m=+329.200406943" Dec 06 15:34:22 crc kubenswrapper[4848]: I1206 15:34:22.866414 4848 generic.go:334] "Generic (PLEG): container finished" podID="5245db24-208d-47cf-9d64-d62b203292e2" containerID="ce52e893d99192f52bb0da937573319382844bd8ebf361cf60be121a3f52accb" exitCode=0 Dec 06 15:34:22 crc kubenswrapper[4848]: I1206 15:34:22.866823 4848 generic.go:334] "Generic (PLEG): container finished" podID="5245db24-208d-47cf-9d64-d62b203292e2" containerID="50edb0e76c8e84aa8609a0f4aa7d8393230960d9c59eee91ed666bca3d7bd54b" exitCode=0 Dec 06 15:34:22 crc kubenswrapper[4848]: I1206 15:34:22.866503 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvh52" event={"ID":"5245db24-208d-47cf-9d64-d62b203292e2","Type":"ContainerDied","Data":"ce52e893d99192f52bb0da937573319382844bd8ebf361cf60be121a3f52accb"} Dec 06 15:34:22 crc kubenswrapper[4848]: I1206 15:34:22.866879 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvh52" 
event={"ID":"5245db24-208d-47cf-9d64-d62b203292e2","Type":"ContainerDied","Data":"50edb0e76c8e84aa8609a0f4aa7d8393230960d9c59eee91ed666bca3d7bd54b"} Dec 06 15:34:22 crc kubenswrapper[4848]: I1206 15:34:22.868806 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47f72" event={"ID":"95984d31-57cc-4d15-b0da-208b3bba0cfc","Type":"ContainerStarted","Data":"673807eef5608880a4f37ccd487e466479f52a441fbdcf75a29b813081e4f2ce"} Dec 06 15:34:23 crc kubenswrapper[4848]: I1206 15:34:23.874370 4848 generic.go:334] "Generic (PLEG): container finished" podID="95984d31-57cc-4d15-b0da-208b3bba0cfc" containerID="673807eef5608880a4f37ccd487e466479f52a441fbdcf75a29b813081e4f2ce" exitCode=0 Dec 06 15:34:23 crc kubenswrapper[4848]: I1206 15:34:23.874419 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47f72" event={"ID":"95984d31-57cc-4d15-b0da-208b3bba0cfc","Type":"ContainerDied","Data":"673807eef5608880a4f37ccd487e466479f52a441fbdcf75a29b813081e4f2ce"} Dec 06 15:34:24 crc kubenswrapper[4848]: I1206 15:34:24.882070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvh52" event={"ID":"5245db24-208d-47cf-9d64-d62b203292e2","Type":"ContainerStarted","Data":"e45234930afc9c73b557b2f6d5099285b6cd15cb0a04db9722e375f028bf2d16"} Dec 06 15:34:24 crc kubenswrapper[4848]: I1206 15:34:24.886531 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47f72" event={"ID":"95984d31-57cc-4d15-b0da-208b3bba0cfc","Type":"ContainerStarted","Data":"35bd7e86382dc7bfd3092620ad36bd2bd32c98144d6308d11351df8b7490d086"} Dec 06 15:34:24 crc kubenswrapper[4848]: I1206 15:34:24.900336 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dvh52" podStartSLOduration=3.294568048 podStartE2EDuration="4.900318666s" podCreationTimestamp="2025-12-06 15:34:20 +0000 UTC" 
firstStartedPulling="2025-12-06 15:34:21.860422044 +0000 UTC m=+329.158432957" lastFinishedPulling="2025-12-06 15:34:23.466172662 +0000 UTC m=+330.764183575" observedRunningTime="2025-12-06 15:34:24.897985842 +0000 UTC m=+332.195996785" watchObservedRunningTime="2025-12-06 15:34:24.900318666 +0000 UTC m=+332.198329579" Dec 06 15:34:24 crc kubenswrapper[4848]: I1206 15:34:24.919791 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47f72" podStartSLOduration=2.335575774 podStartE2EDuration="4.919773173s" podCreationTimestamp="2025-12-06 15:34:20 +0000 UTC" firstStartedPulling="2025-12-06 15:34:21.829880327 +0000 UTC m=+329.127891240" lastFinishedPulling="2025-12-06 15:34:24.414077736 +0000 UTC m=+331.712088639" observedRunningTime="2025-12-06 15:34:24.916642488 +0000 UTC m=+332.214653421" watchObservedRunningTime="2025-12-06 15:34:24.919773173 +0000 UTC m=+332.217784086" Dec 06 15:34:27 crc kubenswrapper[4848]: I1206 15:34:27.435941 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l7h77" Dec 06 15:34:27 crc kubenswrapper[4848]: I1206 15:34:27.481543 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.293685 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.294195 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.331685 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.923347 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.923748 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.946032 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k889q" Dec 06 15:34:28 crc kubenswrapper[4848]: I1206 15:34:28.973297 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:29 crc kubenswrapper[4848]: I1206 15:34:29.948975 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t7xkb" Dec 06 15:34:30 crc kubenswrapper[4848]: I1206 15:34:30.684528 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:30 crc kubenswrapper[4848]: I1206 15:34:30.684563 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:30 crc kubenswrapper[4848]: I1206 15:34:30.719072 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:30 crc kubenswrapper[4848]: I1206 15:34:30.957863 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-47f72" Dec 06 15:34:31 crc kubenswrapper[4848]: I1206 15:34:31.281323 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:31 crc kubenswrapper[4848]: I1206 15:34:31.281365 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:31 
crc kubenswrapper[4848]: I1206 15:34:31.314827 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:31 crc kubenswrapper[4848]: I1206 15:34:31.951957 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dvh52" Dec 06 15:34:47 crc kubenswrapper[4848]: I1206 15:34:47.149980 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:34:47 crc kubenswrapper[4848]: I1206 15:34:47.150299 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:34:52 crc kubenswrapper[4848]: I1206 15:34:52.516131 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" podUID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" containerName="registry" containerID="cri-o://8265df00fe4aca945806dea57fbf6818dca40e9e1706848e043daca7d2925df4" gracePeriod=30 Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.042319 4848 generic.go:334] "Generic (PLEG): container finished" podID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" containerID="8265df00fe4aca945806dea57fbf6818dca40e9e1706848e043daca7d2925df4" exitCode=0 Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.042430 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" 
event={"ID":"f8e41ef6-75a7-4af2-94b0-14ef0274122a","Type":"ContainerDied","Data":"8265df00fe4aca945806dea57fbf6818dca40e9e1706848e043daca7d2925df4"} Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.584051 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.752636 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.752724 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.752777 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.752826 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xxd\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.752980 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.753036 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.753078 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.753120 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates\") pod \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\" (UID: \"f8e41ef6-75a7-4af2-94b0-14ef0274122a\") " Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.754180 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.754781 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.760270 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.760455 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.760871 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd" (OuterVolumeSpecName: "kube-api-access-w2xxd") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "kube-api-access-w2xxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.761558 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.762514 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.777857 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f8e41ef6-75a7-4af2-94b0-14ef0274122a" (UID: "f8e41ef6-75a7-4af2-94b0-14ef0274122a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855010 4848 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855045 4848 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8e41ef6-75a7-4af2-94b0-14ef0274122a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855057 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855067 4848 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8e41ef6-75a7-4af2-94b0-14ef0274122a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855075 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xxd\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-kube-api-access-w2xxd\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855083 4848 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8e41ef6-75a7-4af2-94b0-14ef0274122a-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:55 crc kubenswrapper[4848]: I1206 15:34:55.855091 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8e41ef6-75a7-4af2-94b0-14ef0274122a-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:34:56 crc 
kubenswrapper[4848]: I1206 15:34:56.049074 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" event={"ID":"f8e41ef6-75a7-4af2-94b0-14ef0274122a","Type":"ContainerDied","Data":"21cc8369ffc620dd2bd1be492d7304fe0d925eb2319ed13dd69f5ead0040fe4a"} Dec 06 15:34:56 crc kubenswrapper[4848]: I1206 15:34:56.049128 4848 scope.go:117] "RemoveContainer" containerID="8265df00fe4aca945806dea57fbf6818dca40e9e1706848e043daca7d2925df4" Dec 06 15:34:56 crc kubenswrapper[4848]: I1206 15:34:56.049136 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56mp5" Dec 06 15:34:56 crc kubenswrapper[4848]: I1206 15:34:56.083912 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:34:56 crc kubenswrapper[4848]: I1206 15:34:56.088555 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56mp5"] Dec 06 15:34:56 crc kubenswrapper[4848]: I1206 15:34:56.975099 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" path="/var/lib/kubelet/pods/f8e41ef6-75a7-4af2-94b0-14ef0274122a/volumes" Dec 06 15:35:17 crc kubenswrapper[4848]: I1206 15:35:17.150598 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:35:17 crc kubenswrapper[4848]: I1206 15:35:17.151182 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:35:17 crc kubenswrapper[4848]: I1206 15:35:17.151237 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:35:17 crc kubenswrapper[4848]: I1206 15:35:17.151871 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:35:17 crc kubenswrapper[4848]: I1206 15:35:17.151943 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72" gracePeriod=600 Dec 06 15:35:18 crc kubenswrapper[4848]: I1206 15:35:18.178550 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72" exitCode=0 Dec 06 15:35:18 crc kubenswrapper[4848]: I1206 15:35:18.178610 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72"} Dec 06 15:35:18 crc kubenswrapper[4848]: I1206 15:35:18.179406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc"} Dec 06 15:35:18 crc kubenswrapper[4848]: I1206 15:35:18.179452 4848 scope.go:117] "RemoveContainer" containerID="0aa06378f0635b1e9d0e2c89d666c4469222b8a1e6738898bec897670bfe90ff" Dec 06 15:36:53 crc kubenswrapper[4848]: I1206 15:36:53.210955 4848 scope.go:117] "RemoveContainer" containerID="16d268e8a8e4ae1224ad387cdb7056858b1b33d9cd0c95b59d0aff3373f91f04" Dec 06 15:37:17 crc kubenswrapper[4848]: I1206 15:37:17.149919 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:37:17 crc kubenswrapper[4848]: I1206 15:37:17.150595 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:37:47 crc kubenswrapper[4848]: I1206 15:37:47.149906 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:37:47 crc kubenswrapper[4848]: I1206 15:37:47.150383 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 06 15:37:53 crc kubenswrapper[4848]: I1206 15:37:53.245268 4848 scope.go:117] "RemoveContainer" containerID="547698fb856285f38fa2f13bab647a1a199611d0eec7e5e56ee21a5dbe468fe7" Dec 06 15:37:53 crc kubenswrapper[4848]: I1206 15:37:53.263001 4848 scope.go:117] "RemoveContainer" containerID="55b90ab8fe210c7d33bcf88cf98df2033048e3df8c6d0fe700980af1ea0e6529" Dec 06 15:38:17 crc kubenswrapper[4848]: I1206 15:38:17.150545 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:38:17 crc kubenswrapper[4848]: I1206 15:38:17.151187 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:38:17 crc kubenswrapper[4848]: I1206 15:38:17.151252 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:38:17 crc kubenswrapper[4848]: I1206 15:38:17.151880 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:38:17 crc kubenswrapper[4848]: I1206 15:38:17.151945 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc" gracePeriod=600 Dec 06 15:38:18 crc kubenswrapper[4848]: I1206 15:38:18.281417 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc" exitCode=0 Dec 06 15:38:18 crc kubenswrapper[4848]: I1206 15:38:18.281587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc"} Dec 06 15:38:18 crc kubenswrapper[4848]: I1206 15:38:18.281913 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8"} Dec 06 15:38:18 crc kubenswrapper[4848]: I1206 15:38:18.281946 4848 scope.go:117] "RemoveContainer" containerID="0c99f1328e0668dc9b260317ced6462d308dfaafbbba65376203fa8ba91f7d72" Dec 06 15:39:53 crc kubenswrapper[4848]: I1206 15:39:53.306558 4848 scope.go:117] "RemoveContainer" containerID="92a0adbd814cfd3073e60613776f32a98c4df36bfc1682913cc2eb0ade57cdf9" Dec 06 15:40:17 crc kubenswrapper[4848]: I1206 15:40:17.150797 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:40:17 crc kubenswrapper[4848]: I1206 15:40:17.151458 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:40:35 crc kubenswrapper[4848]: I1206 15:40:35.489762 4848 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 15:40:47 crc kubenswrapper[4848]: I1206 15:40:47.149933 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:40:47 crc kubenswrapper[4848]: I1206 15:40:47.150222 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.048184 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlwlz"] Dec 06 15:40:59 crc kubenswrapper[4848]: E1206 15:40:59.049051 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" containerName="registry" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.049069 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" containerName="registry" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.049201 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e41ef6-75a7-4af2-94b0-14ef0274122a" containerName="registry" Dec 06 15:40:59 crc 
kubenswrapper[4848]: I1206 15:40:59.049644 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.051641 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.052725 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.053452 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mt429" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.065853 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q9bg5"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.066654 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-q9bg5" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.070060 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qbrzg" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.082592 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pwsfx"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.083396 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.088312 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gw2nz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.092270 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlwlz"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.103161 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pwsfx"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.109390 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q9bg5"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.191921 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dmz\" (UniqueName: \"kubernetes.io/projected/be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960-kube-api-access-b8dmz\") pod \"cert-manager-5b446d88c5-q9bg5\" (UID: \"be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960\") " pod="cert-manager/cert-manager-5b446d88c5-q9bg5" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.191972 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rl7s\" (UniqueName: \"kubernetes.io/projected/96eace79-3285-4aef-902d-aed98f97663c-kube-api-access-4rl7s\") pod \"cert-manager-cainjector-7f985d654d-zlwlz\" (UID: \"96eace79-3285-4aef-902d-aed98f97663c\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.192022 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mqn\" (UniqueName: \"kubernetes.io/projected/bd79a30b-a387-4c71-9415-5d8c8a20cd63-kube-api-access-84mqn\") pod 
\"cert-manager-webhook-5655c58dd6-pwsfx\" (UID: \"bd79a30b-a387-4c71-9415-5d8c8a20cd63\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.293599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8dmz\" (UniqueName: \"kubernetes.io/projected/be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960-kube-api-access-b8dmz\") pod \"cert-manager-5b446d88c5-q9bg5\" (UID: \"be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960\") " pod="cert-manager/cert-manager-5b446d88c5-q9bg5" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.293653 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rl7s\" (UniqueName: \"kubernetes.io/projected/96eace79-3285-4aef-902d-aed98f97663c-kube-api-access-4rl7s\") pod \"cert-manager-cainjector-7f985d654d-zlwlz\" (UID: \"96eace79-3285-4aef-902d-aed98f97663c\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.293680 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84mqn\" (UniqueName: \"kubernetes.io/projected/bd79a30b-a387-4c71-9415-5d8c8a20cd63-kube-api-access-84mqn\") pod \"cert-manager-webhook-5655c58dd6-pwsfx\" (UID: \"bd79a30b-a387-4c71-9415-5d8c8a20cd63\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.316748 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rl7s\" (UniqueName: \"kubernetes.io/projected/96eace79-3285-4aef-902d-aed98f97663c-kube-api-access-4rl7s\") pod \"cert-manager-cainjector-7f985d654d-zlwlz\" (UID: \"96eace79-3285-4aef-902d-aed98f97663c\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.319270 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dmz\" 
(UniqueName: \"kubernetes.io/projected/be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960-kube-api-access-b8dmz\") pod \"cert-manager-5b446d88c5-q9bg5\" (UID: \"be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960\") " pod="cert-manager/cert-manager-5b446d88c5-q9bg5" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.319679 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84mqn\" (UniqueName: \"kubernetes.io/projected/bd79a30b-a387-4c71-9415-5d8c8a20cd63-kube-api-access-84mqn\") pod \"cert-manager-webhook-5655c58dd6-pwsfx\" (UID: \"bd79a30b-a387-4c71-9415-5d8c8a20cd63\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.372750 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.388644 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-q9bg5" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.401349 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.802234 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zlwlz"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.805108 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q9bg5"] Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.809557 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 15:40:59 crc kubenswrapper[4848]: I1206 15:40:59.852790 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pwsfx"] Dec 06 15:41:00 crc kubenswrapper[4848]: I1206 15:41:00.213116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-q9bg5" event={"ID":"be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960","Type":"ContainerStarted","Data":"f645f93224fcdf1badea829de74d00ab03805b8e73279f4382834a6fc27ea6e8"} Dec 06 15:41:00 crc kubenswrapper[4848]: I1206 15:41:00.214218 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" event={"ID":"bd79a30b-a387-4c71-9415-5d8c8a20cd63","Type":"ContainerStarted","Data":"19d1c972df25d77519c336740d7674d49dae998c73b04f80b056952927cc03fc"} Dec 06 15:41:00 crc kubenswrapper[4848]: I1206 15:41:00.215208 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" event={"ID":"96eace79-3285-4aef-902d-aed98f97663c","Type":"ContainerStarted","Data":"2ac5b85cc5303e170a4744e12e78f66458ce6a14f8d47417cb7d097da00aa646"} Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.246314 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-q9bg5" 
event={"ID":"be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960","Type":"ContainerStarted","Data":"cb9582f0526e096b20656a3377a64fdf64054091510fa20d8c589ac706a6030c"} Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.249135 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" event={"ID":"bd79a30b-a387-4c71-9415-5d8c8a20cd63","Type":"ContainerStarted","Data":"3d729ef283a5b558c79046b7f86eb0b23da9b10cf889f5a5800623977c5f6f7f"} Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.249339 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.251238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" event={"ID":"96eace79-3285-4aef-902d-aed98f97663c","Type":"ContainerStarted","Data":"878306fcaf5086c6c7dad219b586afc7c680c6d4c22e421d608ad1664aa5eee2"} Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.283948 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-q9bg5" podStartSLOduration=1.278843409 podStartE2EDuration="5.283931148s" podCreationTimestamp="2025-12-06 15:40:59 +0000 UTC" firstStartedPulling="2025-12-06 15:40:59.810837466 +0000 UTC m=+727.108848399" lastFinishedPulling="2025-12-06 15:41:03.815925225 +0000 UTC m=+731.113936138" observedRunningTime="2025-12-06 15:41:04.267074932 +0000 UTC m=+731.565085855" watchObservedRunningTime="2025-12-06 15:41:04.283931148 +0000 UTC m=+731.581942071" Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.286208 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-zlwlz" podStartSLOduration=1.284659685 podStartE2EDuration="5.286200029s" podCreationTimestamp="2025-12-06 15:40:59 +0000 UTC" firstStartedPulling="2025-12-06 15:40:59.809271743 +0000 
UTC m=+727.107282676" lastFinishedPulling="2025-12-06 15:41:03.810812097 +0000 UTC m=+731.108823020" observedRunningTime="2025-12-06 15:41:04.281867102 +0000 UTC m=+731.579878035" watchObservedRunningTime="2025-12-06 15:41:04.286200029 +0000 UTC m=+731.584210952" Dec 06 15:41:04 crc kubenswrapper[4848]: I1206 15:41:04.307186 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" podStartSLOduration=1.35016229 podStartE2EDuration="5.307168237s" podCreationTimestamp="2025-12-06 15:40:59 +0000 UTC" firstStartedPulling="2025-12-06 15:40:59.859428522 +0000 UTC m=+727.157439435" lastFinishedPulling="2025-12-06 15:41:03.816434469 +0000 UTC m=+731.114445382" observedRunningTime="2025-12-06 15:41:04.303757865 +0000 UTC m=+731.601768788" watchObservedRunningTime="2025-12-06 15:41:04.307168237 +0000 UTC m=+731.605179150" Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.930219 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g4jc"] Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931044 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-controller" containerID="cri-o://05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931110 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="nbdb" containerID="cri-o://1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931181 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" 
podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-node" containerID="cri-o://6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931183 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="northd" containerID="cri-o://3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931238 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-acl-logging" containerID="cri-o://1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931317 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="sbdb" containerID="cri-o://da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.931426 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" gracePeriod=30 Dec 06 15:41:08 crc kubenswrapper[4848]: I1206 15:41:08.969619 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" 
containerID="cri-o://149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" gracePeriod=30 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.203300 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/2.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.205281 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovn-acl-logging/0.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.205772 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovn-controller/0.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.206198 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.253970 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j9n72"] Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254192 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254211 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254226 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kubecfg-setup" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254234 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kubecfg-setup" Dec 06 15:41:09 crc kubenswrapper[4848]: 
E1206 15:41:09.254249 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-node" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254258 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-node" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254270 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254278 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254288 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254296 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254309 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254317 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254328 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254335 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc 
kubenswrapper[4848]: E1206 15:41:09.254345 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="sbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254354 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="sbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254364 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254372 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254381 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="nbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254389 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="nbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254399 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="northd" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254406 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="northd" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.254420 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-acl-logging" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254428 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-acl-logging" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254549 4848 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254562 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254577 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254586 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="northd" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254598 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254607 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254635 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="nbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254647 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovn-acl-logging" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254659 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="kube-rbac-proxy-node" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.254669 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="sbdb" Dec 06 15:41:09 crc kubenswrapper[4848]: 
I1206 15:41:09.254921 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerName="ovnkube-controller" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.256921 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.289630 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qx6m8_9c409d16-f97d-4bcd-bf25-b80af1b16922/kube-multus/1.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.290291 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qx6m8_9c409d16-f97d-4bcd-bf25-b80af1b16922/kube-multus/0.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.290332 4848 generic.go:334] "Generic (PLEG): container finished" podID="9c409d16-f97d-4bcd-bf25-b80af1b16922" containerID="04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b" exitCode=2 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.290382 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerDied","Data":"04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.290415 4848 scope.go:117] "RemoveContainer" containerID="56a10d50c32a199fb67d3f92cac03a0a31b2c347f27d68d91370db2281409541" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.291148 4848 scope.go:117] "RemoveContainer" containerID="04bd10779b2e6d35c9c3deb96ec020ab03381619f7bc56bc994363a684bee55b" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.293599 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovnkube-controller/2.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298232 4848 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovn-acl-logging/0.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298691 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8g4jc_9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/ovn-controller/0.log" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298945 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298968 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298975 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298982 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298990 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.298997 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" exitCode=0 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299004 4848 
generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" exitCode=143 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299011 4848 generic.go:334] "Generic (PLEG): container finished" podID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" exitCode=143 Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299031 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299066 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299088 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" 
event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299097 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299108 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299117 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299123 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299128 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299133 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299138 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:41:09 crc 
kubenswrapper[4848]: I1206 15:41:09.299143 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299148 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299153 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299159 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299165 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299172 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299178 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299183 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299188 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299193 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299198 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299203 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299208 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299213 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299217 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299224 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299231 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299237 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299241 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299246 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299251 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299255 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299260 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 
15:41:09.299264 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299269 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299276 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299283 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" event={"ID":"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135","Type":"ContainerDied","Data":"19da850c4e2a55d8aad805fee1f96c94045408809464b9ba54f7dd2e1ccf068d"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299290 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299296 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299301 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299307 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299312 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299317 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299322 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299328 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299333 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299339 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.299415 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g4jc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.321644 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322009 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322074 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322117 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322135 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322209 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322237 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322158 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322302 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322335 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: 
\"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322368 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322400 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322411 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log" (OuterVolumeSpecName: "node-log") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322416 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322435 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). 
InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322445 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322458 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash" (OuterVolumeSpecName: "host-slash") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322492 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322523 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322550 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322591 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322574 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322623 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket" (OuterVolumeSpecName: "log-socket") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322629 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322651 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322677 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322716 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322737 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322755 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznnm\" (UniqueName: \"kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322771 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322819 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd\") pod \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\" (UID: \"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135\") " Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322816 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322864 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322962 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-config\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.322984 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-netns\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323032 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-kubelet\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323067 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-var-lib-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323087 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovn-node-metrics-cert\") pod \"ovnkube-node-j9n72\" (UID: 
\"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323112 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323140 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-systemd-units\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323163 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-slash\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323187 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxpw\" (UniqueName: 
\"kubernetes.io/projected/bcb47cae-1fef-44d4-b295-23e504c9ea99-kube-api-access-tlxpw\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-node-log\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323029 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323255 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-env-overrides\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323298 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-ovn\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323318 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-systemd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323332 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-bin\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323352 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-netd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323367 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-script-lib\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323393 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-log-socket\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323425 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-etc-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323461 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323293 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323661 4848 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323778 4848 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-log-socket\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323798 4848 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323808 4848 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323818 4848 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323828 4848 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323861 4848 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323871 4848 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323899 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323907 4848 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323915 4848 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323923 4848 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323932 4848 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-node-log\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323939 4848 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-host-slash\") on node \"crc\" 
DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.323967 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.324125 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.327345 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.329425 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm" (OuterVolumeSpecName: "kube-api-access-zznnm") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "kube-api-access-zznnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.335684 4848 scope.go:117] "RemoveContainer" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.339737 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" (UID: "9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.365090 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.390758 4848 scope.go:117] "RemoveContainer" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.404175 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-pwsfx" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.407826 4848 scope.go:117] "RemoveContainer" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425708 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-env-overrides\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425791 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-ovn\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425852 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-systemd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425880 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-bin\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425908 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-ovn\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.425983 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-netd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-script-lib\") pod \"ovnkube-node-j9n72\" 
(UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426015 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-bin\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426154 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-cni-netd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426068 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-systemd\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426340 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-log-socket\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426312 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-log-socket\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc 
kubenswrapper[4848]: I1206 15:41:09.426423 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-etc-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426445 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-env-overrides\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426452 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426515 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-etc-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426559 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-config\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 
15:41:09.426574 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426582 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-netns\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426667 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-kubelet\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426724 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-kubelet\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-var-lib-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426781 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovn-node-metrics-cert\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426817 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426830 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-var-lib-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426865 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-systemd-units\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426907 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-slash\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426945 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426949 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-systemd-units\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-run-openvswitch\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426977 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxpw\" (UniqueName: \"kubernetes.io/projected/bcb47cae-1fef-44d4-b295-23e504c9ea99-kube-api-access-tlxpw\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.426990 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-slash\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427002 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427039 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-node-log\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427134 4848 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-node-log\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427150 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427166 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427182 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-env-overrides\") on 
node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427200 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznnm\" (UniqueName: \"kubernetes.io/projected/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135-kube-api-access-zznnm\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427360 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcb47cae-1fef-44d4-b295-23e504c9ea99-host-run-netns\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.427429 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-config\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.428945 4848 scope.go:117] "RemoveContainer" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.429260 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovnkube-script-lib\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.430380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bcb47cae-1fef-44d4-b295-23e504c9ea99-ovn-node-metrics-cert\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.443920 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxpw\" (UniqueName: \"kubernetes.io/projected/bcb47cae-1fef-44d4-b295-23e504c9ea99-kube-api-access-tlxpw\") pod \"ovnkube-node-j9n72\" (UID: \"bcb47cae-1fef-44d4-b295-23e504c9ea99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.445874 4848 scope.go:117] "RemoveContainer" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.458967 4848 scope.go:117] "RemoveContainer" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.470571 4848 scope.go:117] "RemoveContainer" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.486356 4848 scope.go:117] "RemoveContainer" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.497933 4848 scope.go:117] "RemoveContainer" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.511896 4848 scope.go:117] "RemoveContainer" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.512283 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": container with ID starting with 149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24 not found: ID does not exist" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc 
kubenswrapper[4848]: I1206 15:41:09.512324 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} err="failed to get container status \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": rpc error: code = NotFound desc = could not find container \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": container with ID starting with 149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.512352 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.512608 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": container with ID starting with ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71 not found: ID does not exist" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.512635 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} err="failed to get container status \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": rpc error: code = NotFound desc = could not find container \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": container with ID starting with ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.512652 4848 scope.go:117] "RemoveContainer" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 
15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.512973 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": container with ID starting with da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659 not found: ID does not exist" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.512999 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} err="failed to get container status \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": rpc error: code = NotFound desc = could not find container \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": container with ID starting with da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513016 4848 scope.go:117] "RemoveContainer" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.513209 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": container with ID starting with 1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64 not found: ID does not exist" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513237 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} err="failed to get container status 
\"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": rpc error: code = NotFound desc = could not find container \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": container with ID starting with 1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513255 4848 scope.go:117] "RemoveContainer" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.513532 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": container with ID starting with 3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9 not found: ID does not exist" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513559 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} err="failed to get container status \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": rpc error: code = NotFound desc = could not find container \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": container with ID starting with 3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513582 4848 scope.go:117] "RemoveContainer" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.513896 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": container with ID starting with 14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc not found: ID does not exist" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513925 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} err="failed to get container status \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": rpc error: code = NotFound desc = could not find container \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": container with ID starting with 14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.513942 4848 scope.go:117] "RemoveContainer" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.514276 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": container with ID starting with 6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9 not found: ID does not exist" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.514305 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} err="failed to get container status \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": rpc error: code = NotFound desc = could not find container \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": container with ID 
starting with 6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.514324 4848 scope.go:117] "RemoveContainer" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.514588 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": container with ID starting with 1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88 not found: ID does not exist" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.514617 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} err="failed to get container status \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": rpc error: code = NotFound desc = could not find container \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": container with ID starting with 1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.514635 4848 scope.go:117] "RemoveContainer" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.515276 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": container with ID starting with 05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67 not found: ID does not exist" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 
15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.515329 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} err="failed to get container status \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": rpc error: code = NotFound desc = could not find container \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": container with ID starting with 05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.515356 4848 scope.go:117] "RemoveContainer" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: E1206 15:41:09.515756 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": container with ID starting with 3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8 not found: ID does not exist" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.515808 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} err="failed to get container status \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": rpc error: code = NotFound desc = could not find container \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": container with ID starting with 3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.515830 4848 scope.go:117] "RemoveContainer" 
containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516112 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} err="failed to get container status \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": rpc error: code = NotFound desc = could not find container \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": container with ID starting with 149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516136 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516593 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} err="failed to get container status \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": rpc error: code = NotFound desc = could not find container \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": container with ID starting with ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516617 4848 scope.go:117] "RemoveContainer" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516906 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} err="failed to get container status \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": rpc error: code = NotFound desc = could 
not find container \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": container with ID starting with da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.516929 4848 scope.go:117] "RemoveContainer" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.517267 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} err="failed to get container status \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": rpc error: code = NotFound desc = could not find container \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": container with ID starting with 1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.517294 4848 scope.go:117] "RemoveContainer" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.517649 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} err="failed to get container status \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": rpc error: code = NotFound desc = could not find container \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": container with ID starting with 3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.517674 4848 scope.go:117] "RemoveContainer" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 
15:41:09.517913 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} err="failed to get container status \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": rpc error: code = NotFound desc = could not find container \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": container with ID starting with 14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.517940 4848 scope.go:117] "RemoveContainer" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518123 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} err="failed to get container status \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": rpc error: code = NotFound desc = could not find container \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": container with ID starting with 6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518146 4848 scope.go:117] "RemoveContainer" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518373 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} err="failed to get container status \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": rpc error: code = NotFound desc = could not find container \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": container with ID starting with 
1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518408 4848 scope.go:117] "RemoveContainer" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518919 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} err="failed to get container status \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": rpc error: code = NotFound desc = could not find container \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": container with ID starting with 05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.518948 4848 scope.go:117] "RemoveContainer" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519193 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} err="failed to get container status \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": rpc error: code = NotFound desc = could not find container \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": container with ID starting with 3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519219 4848 scope.go:117] "RemoveContainer" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519411 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} err="failed to get container status \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": rpc error: code = NotFound desc = could not find container \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": container with ID starting with 149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519441 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519705 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} err="failed to get container status \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": rpc error: code = NotFound desc = could not find container \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": container with ID starting with ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519730 4848 scope.go:117] "RemoveContainer" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.519983 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} err="failed to get container status \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": rpc error: code = NotFound desc = could not find container \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": container with ID starting with da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659 not found: ID does not 
exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.520013 4848 scope.go:117] "RemoveContainer" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.520543 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} err="failed to get container status \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": rpc error: code = NotFound desc = could not find container \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": container with ID starting with 1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.520571 4848 scope.go:117] "RemoveContainer" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.520823 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} err="failed to get container status \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": rpc error: code = NotFound desc = could not find container \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": container with ID starting with 3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.520852 4848 scope.go:117] "RemoveContainer" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.521381 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} err="failed to get container status 
\"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": rpc error: code = NotFound desc = could not find container \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": container with ID starting with 14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.521406 4848 scope.go:117] "RemoveContainer" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.521659 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} err="failed to get container status \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": rpc error: code = NotFound desc = could not find container \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": container with ID starting with 6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.521685 4848 scope.go:117] "RemoveContainer" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522014 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} err="failed to get container status \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": rpc error: code = NotFound desc = could not find container \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": container with ID starting with 1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522039 4848 scope.go:117] "RemoveContainer" 
containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522300 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} err="failed to get container status \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": rpc error: code = NotFound desc = could not find container \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": container with ID starting with 05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522332 4848 scope.go:117] "RemoveContainer" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522574 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} err="failed to get container status \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": rpc error: code = NotFound desc = could not find container \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": container with ID starting with 3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.522600 4848 scope.go:117] "RemoveContainer" containerID="149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.523576 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24"} err="failed to get container status \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": rpc error: code = NotFound desc = could 
not find container \"149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24\": container with ID starting with 149a052b268b7c6833b2dca2637ee4c6fde5b792a84db397e64f792a962f6c24 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.523613 4848 scope.go:117] "RemoveContainer" containerID="ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524090 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71"} err="failed to get container status \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": rpc error: code = NotFound desc = could not find container \"ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71\": container with ID starting with ae092072b4a242c2f043f3b9fe8c4c22cb29afb00b21ca3b8321b1a52336ba71 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524117 4848 scope.go:117] "RemoveContainer" containerID="da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524386 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659"} err="failed to get container status \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": rpc error: code = NotFound desc = could not find container \"da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659\": container with ID starting with da0f5b06edaa6b25096becfc12f4905d3b2f410b0ed67a8f840aa62bb7059659 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524411 4848 scope.go:117] "RemoveContainer" containerID="1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 
15:41:09.524678 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64"} err="failed to get container status \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": rpc error: code = NotFound desc = could not find container \"1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64\": container with ID starting with 1cc6bd5e6b5ed5fd658359559ac1f0003f09c5beaecadc2f2712e5751786ae64 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524717 4848 scope.go:117] "RemoveContainer" containerID="3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524955 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9"} err="failed to get container status \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": rpc error: code = NotFound desc = could not find container \"3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9\": container with ID starting with 3caf9e1646d88f41abf30949c79be24e2e6cb239fe87656e679e7815d9ca78f9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.524990 4848 scope.go:117] "RemoveContainer" containerID="14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.525622 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc"} err="failed to get container status \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": rpc error: code = NotFound desc = could not find container \"14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc\": container with ID starting with 
14bd68a021f6897006bef9f9d76e3c6d0874ed0f19709b4d5513dc3af4e05afc not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.525647 4848 scope.go:117] "RemoveContainer" containerID="6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526011 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9"} err="failed to get container status \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": rpc error: code = NotFound desc = could not find container \"6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9\": container with ID starting with 6d4db9780072774d617236a116d8f643719c7ac2ce17d6002ba582dfea26e9c9 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526037 4848 scope.go:117] "RemoveContainer" containerID="1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526346 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88"} err="failed to get container status \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": rpc error: code = NotFound desc = could not find container \"1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88\": container with ID starting with 1152f374f5441614f29992f3bc48137ad392d249b81b894a4d4fa9cade40df88 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526373 4848 scope.go:117] "RemoveContainer" containerID="05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526634 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67"} err="failed to get container status \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": rpc error: code = NotFound desc = could not find container \"05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67\": container with ID starting with 05b1726bc83874eb9b8730bd960d6751e5b634d029ca01d91660855072804e67 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526658 4848 scope.go:117] "RemoveContainer" containerID="3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.526975 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8"} err="failed to get container status \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": rpc error: code = NotFound desc = could not find container \"3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8\": container with ID starting with 3170c61b0bc7742d31c7610423f24a6aa9814eddd5b7bb4fd0f32dd23ad4a1c8 not found: ID does not exist" Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.571883 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:09 crc kubenswrapper[4848]: W1206 15:41:09.588142 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb47cae_1fef_44d4_b295_23e504c9ea99.slice/crio-7913e4929e4ac761dba35d0299afb98b190f51ea01b0f00483abe864823c14ec WatchSource:0}: Error finding container 7913e4929e4ac761dba35d0299afb98b190f51ea01b0f00483abe864823c14ec: Status 404 returned error can't find the container with id 7913e4929e4ac761dba35d0299afb98b190f51ea01b0f00483abe864823c14ec Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.629824 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g4jc"] Dec 06 15:41:09 crc kubenswrapper[4848]: I1206 15:41:09.635925 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g4jc"] Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.307587 4848 generic.go:334] "Generic (PLEG): container finished" podID="bcb47cae-1fef-44d4-b295-23e504c9ea99" containerID="bdf1d4a2a9232c15692aea9fc504b60f61caef104211281ba0a527990a377c53" exitCode=0 Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.307648 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerDied","Data":"bdf1d4a2a9232c15692aea9fc504b60f61caef104211281ba0a527990a377c53"} Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.308100 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"7913e4929e4ac761dba35d0299afb98b190f51ea01b0f00483abe864823c14ec"} Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.321662 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-qx6m8_9c409d16-f97d-4bcd-bf25-b80af1b16922/kube-multus/1.log" Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.321976 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qx6m8" event={"ID":"9c409d16-f97d-4bcd-bf25-b80af1b16922","Type":"ContainerStarted","Data":"751b9c1907e9ef043eae48f867644e9993b8b7826aea662042b390ca1f86e425"} Dec 06 15:41:10 crc kubenswrapper[4848]: I1206 15:41:10.973733 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135" path="/var/lib/kubelet/pods/9f17f1f8-8f7c-41bc-bccf-dafeeb0b7135/volumes" Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 15:41:11.329505 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"d007b2fb76a7415446f0749650752ed40a7e3e49ca53bf4874bb72a5c704445a"} Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 15:41:11.329543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"744cb384f2896177a91218e48996f51e15795e99e8af2d0af85b4984d5261aed"} Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 15:41:11.329553 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"3d7a8f8522770abcd527a3c9c0b8e57438187981539cc5a6488f822bf79ba489"} Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 15:41:11.329561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"088c93861be80f2c97b21d8d4f5fdb7ee4bd6fcc023cf2e91d20b8d010123318"} Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 
15:41:11.329568 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"1c27e67eb56688bb8fb46cd4c59911b6079aa957cbd4b98f671ee290e38481e8"} Dec 06 15:41:11 crc kubenswrapper[4848]: I1206 15:41:11.329577 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"fb1a52f186186a255af9d4091851e75683181e90054ea736385b83cf4d35124b"} Dec 06 15:41:13 crc kubenswrapper[4848]: I1206 15:41:13.343811 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"f0b8a8ead1f3b5cf17a6386f138335a2c1889889a27b8a7d41c0ca8d37b07da3"} Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.366905 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" event={"ID":"bcb47cae-1fef-44d4-b295-23e504c9ea99","Type":"ContainerStarted","Data":"786d6260ab3b94c6cb05cb06b6665a3005fc973653cd41d67ee5c223d51979d4"} Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.367618 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.367640 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.367652 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.399749 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" podStartSLOduration=7.39973057 
podStartE2EDuration="7.39973057s" podCreationTimestamp="2025-12-06 15:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:41:16.39863035 +0000 UTC m=+743.696641263" watchObservedRunningTime="2025-12-06 15:41:16.39973057 +0000 UTC m=+743.697741483" Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.406133 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:16 crc kubenswrapper[4848]: I1206 15:41:16.406674 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:17 crc kubenswrapper[4848]: I1206 15:41:17.150139 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:41:17 crc kubenswrapper[4848]: I1206 15:41:17.150260 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:41:17 crc kubenswrapper[4848]: I1206 15:41:17.150333 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:41:17 crc kubenswrapper[4848]: I1206 15:41:17.151266 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8"} 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:41:17 crc kubenswrapper[4848]: I1206 15:41:17.151349 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8" gracePeriod=600 Dec 06 15:41:18 crc kubenswrapper[4848]: I1206 15:41:18.381135 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8" exitCode=0 Dec 06 15:41:18 crc kubenswrapper[4848]: I1206 15:41:18.381210 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8"} Dec 06 15:41:18 crc kubenswrapper[4848]: I1206 15:41:18.381536 4848 scope.go:117] "RemoveContainer" containerID="dffb056b5b0c944f6ca07e0173a6a84cf532fc36395779266ccdf117768e05dc" Dec 06 15:41:19 crc kubenswrapper[4848]: I1206 15:41:19.388799 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc"} Dec 06 15:41:39 crc kubenswrapper[4848]: I1206 15:41:39.592406 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9n72" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.623658 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc"] Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.625226 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.626798 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.632430 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc"] Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.722324 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtk2q\" (UniqueName: \"kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.722446 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.722473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.824474 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk2q\" (UniqueName: \"kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.824592 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.824616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.825079 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.825145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.847739 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtk2q\" (UniqueName: \"kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:41 crc kubenswrapper[4848]: I1206 15:41:41.939973 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:42 crc kubenswrapper[4848]: I1206 15:41:42.310882 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc"] Dec 06 15:41:42 crc kubenswrapper[4848]: I1206 15:41:42.503427 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerStarted","Data":"ae1379a07e07365e44518ffbb2f1ab1aeaed0647fc7a120c43ad5466622873a6"} Dec 06 15:41:42 crc kubenswrapper[4848]: I1206 15:41:42.503475 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerStarted","Data":"3283002a740eeb2a0e8960449cf34dadeb11115def6d01d0da8b63d40fe0a082"} Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.509668 4848 generic.go:334] "Generic (PLEG): container finished" podID="79b3772c-7550-4073-a2cc-42508125cb74" containerID="ae1379a07e07365e44518ffbb2f1ab1aeaed0647fc7a120c43ad5466622873a6" exitCode=0 Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.509752 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerDied","Data":"ae1379a07e07365e44518ffbb2f1ab1aeaed0647fc7a120c43ad5466622873a6"} Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.894816 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.896160 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.902128 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.948870 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.948922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn8bp\" (UniqueName: \"kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:43 crc kubenswrapper[4848]: I1206 15:41:43.949004 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.049946 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.050199 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.050320 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn8bp\" (UniqueName: \"kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.050480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.051131 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.076930 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn8bp\" (UniqueName: \"kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp\") pod \"redhat-operators-clvn4\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.214915 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.402546 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:41:44 crc kubenswrapper[4848]: I1206 15:41:44.517084 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerStarted","Data":"d2b31bb4bf5058c320e49076bab4c7c5b97c09acc6866b2c6de69697149a51c4"} Dec 06 15:41:45 crc kubenswrapper[4848]: I1206 15:41:45.523082 4848 generic.go:334] "Generic (PLEG): container finished" podID="79b3772c-7550-4073-a2cc-42508125cb74" containerID="5a56e1d886cd4acec5a0a13f5b6bd39527b04cc8446619a8889a86aa692b7126" exitCode=0 Dec 06 15:41:45 crc kubenswrapper[4848]: I1206 15:41:45.523197 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerDied","Data":"5a56e1d886cd4acec5a0a13f5b6bd39527b04cc8446619a8889a86aa692b7126"} Dec 06 15:41:45 crc kubenswrapper[4848]: I1206 15:41:45.525366 4848 generic.go:334] "Generic (PLEG): container finished" podID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerID="e9b8887c14e5d780a0e8d76c240d53c0cf5e3733338598076dc5d58b9c0c78ae" exitCode=0 Dec 06 15:41:45 crc kubenswrapper[4848]: I1206 15:41:45.525421 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerDied","Data":"e9b8887c14e5d780a0e8d76c240d53c0cf5e3733338598076dc5d58b9c0c78ae"} Dec 06 15:41:46 crc kubenswrapper[4848]: I1206 15:41:46.532329 4848 generic.go:334] "Generic (PLEG): container finished" podID="79b3772c-7550-4073-a2cc-42508125cb74" 
containerID="686c6a4504a748bb76d5a84d0668c06af8fd2165ca571d19600a039cdb2c573f" exitCode=0 Dec 06 15:41:46 crc kubenswrapper[4848]: I1206 15:41:46.532382 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerDied","Data":"686c6a4504a748bb76d5a84d0668c06af8fd2165ca571d19600a039cdb2c573f"} Dec 06 15:41:46 crc kubenswrapper[4848]: I1206 15:41:46.545045 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerStarted","Data":"ffe7a9cbb09fac2b7afd088dff748cf9e966c1a25986d0e2a946422263f7993b"} Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.551284 4848 generic.go:334] "Generic (PLEG): container finished" podID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerID="ffe7a9cbb09fac2b7afd088dff748cf9e966c1a25986d0e2a946422263f7993b" exitCode=0 Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.551385 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerDied","Data":"ffe7a9cbb09fac2b7afd088dff748cf9e966c1a25986d0e2a946422263f7993b"} Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.744806 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.894003 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util\") pod \"79b3772c-7550-4073-a2cc-42508125cb74\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.894086 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtk2q\" (UniqueName: \"kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q\") pod \"79b3772c-7550-4073-a2cc-42508125cb74\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.894133 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle\") pod \"79b3772c-7550-4073-a2cc-42508125cb74\" (UID: \"79b3772c-7550-4073-a2cc-42508125cb74\") " Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.894847 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle" (OuterVolumeSpecName: "bundle") pod "79b3772c-7550-4073-a2cc-42508125cb74" (UID: "79b3772c-7550-4073-a2cc-42508125cb74"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.907444 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util" (OuterVolumeSpecName: "util") pod "79b3772c-7550-4073-a2cc-42508125cb74" (UID: "79b3772c-7550-4073-a2cc-42508125cb74"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.907844 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q" (OuterVolumeSpecName: "kube-api-access-mtk2q") pod "79b3772c-7550-4073-a2cc-42508125cb74" (UID: "79b3772c-7550-4073-a2cc-42508125cb74"). InnerVolumeSpecName "kube-api-access-mtk2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.996527 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-util\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.996566 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtk2q\" (UniqueName: \"kubernetes.io/projected/79b3772c-7550-4073-a2cc-42508125cb74-kube-api-access-mtk2q\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:47 crc kubenswrapper[4848]: I1206 15:41:47.996579 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b3772c-7550-4073-a2cc-42508125cb74-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:41:48 crc kubenswrapper[4848]: I1206 15:41:48.557576 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" Dec 06 15:41:48 crc kubenswrapper[4848]: I1206 15:41:48.557575 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc" event={"ID":"79b3772c-7550-4073-a2cc-42508125cb74","Type":"ContainerDied","Data":"3283002a740eeb2a0e8960449cf34dadeb11115def6d01d0da8b63d40fe0a082"} Dec 06 15:41:48 crc kubenswrapper[4848]: I1206 15:41:48.557723 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3283002a740eeb2a0e8960449cf34dadeb11115def6d01d0da8b63d40fe0a082" Dec 06 15:41:48 crc kubenswrapper[4848]: I1206 15:41:48.559086 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerStarted","Data":"e8a335579a31a365a3b4bbac2d051716bb690e9e17dd20507680d4d1afe1dc88"} Dec 06 15:41:48 crc kubenswrapper[4848]: I1206 15:41:48.576723 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-clvn4" podStartSLOduration=3.148874058 podStartE2EDuration="5.576688858s" podCreationTimestamp="2025-12-06 15:41:43 +0000 UTC" firstStartedPulling="2025-12-06 15:41:45.526989059 +0000 UTC m=+772.824999972" lastFinishedPulling="2025-12-06 15:41:47.954803869 +0000 UTC m=+775.252814772" observedRunningTime="2025-12-06 15:41:48.576076762 +0000 UTC m=+775.874087675" watchObservedRunningTime="2025-12-06 15:41:48.576688858 +0000 UTC m=+775.874699771" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.135707 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn"] Dec 06 15:41:53 crc kubenswrapper[4848]: E1206 15:41:53.136250 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b3772c-7550-4073-a2cc-42508125cb74" 
containerName="pull" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.136266 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="pull" Dec 06 15:41:53 crc kubenswrapper[4848]: E1206 15:41:53.136290 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="util" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.136298 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="util" Dec 06 15:41:53 crc kubenswrapper[4848]: E1206 15:41:53.136315 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="extract" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.136322 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="extract" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.136431 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b3772c-7550-4073-a2cc-42508125cb74" containerName="extract" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.136872 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.138567 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6pnqt" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.138978 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.139041 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.145310 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn"] Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.156487 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmpl5\" (UniqueName: \"kubernetes.io/projected/3ad4635f-e66a-4dee-a97b-e2b94ae72319-kube-api-access-dmpl5\") pod \"nmstate-operator-5b5b58f5c8-5m7pn\" (UID: \"3ad4635f-e66a-4dee-a97b-e2b94ae72319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.257043 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmpl5\" (UniqueName: \"kubernetes.io/projected/3ad4635f-e66a-4dee-a97b-e2b94ae72319-kube-api-access-dmpl5\") pod \"nmstate-operator-5b5b58f5c8-5m7pn\" (UID: \"3ad4635f-e66a-4dee-a97b-e2b94ae72319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.275318 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmpl5\" (UniqueName: \"kubernetes.io/projected/3ad4635f-e66a-4dee-a97b-e2b94ae72319-kube-api-access-dmpl5\") pod \"nmstate-operator-5b5b58f5c8-5m7pn\" (UID: 
\"3ad4635f-e66a-4dee-a97b-e2b94ae72319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.454438 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" Dec 06 15:41:53 crc kubenswrapper[4848]: I1206 15:41:53.640979 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn"] Dec 06 15:41:53 crc kubenswrapper[4848]: W1206 15:41:53.645827 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad4635f_e66a_4dee_a97b_e2b94ae72319.slice/crio-077a58d2b2ba58181973fcb6a2beb6e658f05f4feb069d29756db18f3b4b04ee WatchSource:0}: Error finding container 077a58d2b2ba58181973fcb6a2beb6e658f05f4feb069d29756db18f3b4b04ee: Status 404 returned error can't find the container with id 077a58d2b2ba58181973fcb6a2beb6e658f05f4feb069d29756db18f3b4b04ee Dec 06 15:41:54 crc kubenswrapper[4848]: I1206 15:41:54.216117 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:54 crc kubenswrapper[4848]: I1206 15:41:54.216164 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:54 crc kubenswrapper[4848]: I1206 15:41:54.264996 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:54 crc kubenswrapper[4848]: I1206 15:41:54.589948 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" event={"ID":"3ad4635f-e66a-4dee-a97b-e2b94ae72319","Type":"ContainerStarted","Data":"077a58d2b2ba58181973fcb6a2beb6e658f05f4feb069d29756db18f3b4b04ee"} Dec 06 15:41:54 crc kubenswrapper[4848]: I1206 15:41:54.622865 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:41:56 crc kubenswrapper[4848]: I1206 15:41:56.607537 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" event={"ID":"3ad4635f-e66a-4dee-a97b-e2b94ae72319","Type":"ContainerStarted","Data":"d8301bfb3e06e07ddfc0bc45caf27a315fd81a7d9ee5f574d2aed04232a14543"} Dec 06 15:41:56 crc kubenswrapper[4848]: I1206 15:41:56.628015 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5m7pn" podStartSLOduration=1.55832137 podStartE2EDuration="3.627997383s" podCreationTimestamp="2025-12-06 15:41:53 +0000 UTC" firstStartedPulling="2025-12-06 15:41:53.647946788 +0000 UTC m=+780.945957701" lastFinishedPulling="2025-12-06 15:41:55.717622801 +0000 UTC m=+783.015633714" observedRunningTime="2025-12-06 15:41:56.627298955 +0000 UTC m=+783.925309878" watchObservedRunningTime="2025-12-06 15:41:56.627997383 +0000 UTC m=+783.926008296" Dec 06 15:41:56 crc kubenswrapper[4848]: I1206 15:41:56.869402 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:41:56 crc kubenswrapper[4848]: I1206 15:41:56.869613 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-clvn4" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="registry-server" containerID="cri-o://e8a335579a31a365a3b4bbac2d051716bb690e9e17dd20507680d4d1afe1dc88" gracePeriod=2 Dec 06 15:41:59 crc kubenswrapper[4848]: I1206 15:41:59.623987 4848 generic.go:334] "Generic (PLEG): container finished" podID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerID="e8a335579a31a365a3b4bbac2d051716bb690e9e17dd20507680d4d1afe1dc88" exitCode=0 Dec 06 15:41:59 crc kubenswrapper[4848]: I1206 15:41:59.624180 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerDied","Data":"e8a335579a31a365a3b4bbac2d051716bb690e9e17dd20507680d4d1afe1dc88"} Dec 06 15:41:59 crc kubenswrapper[4848]: I1206 15:41:59.897057 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.059203 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn8bp\" (UniqueName: \"kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp\") pod \"a9ebee39-b188-43cc-8a30-cdaa5c250611\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.059487 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities\") pod \"a9ebee39-b188-43cc-8a30-cdaa5c250611\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.059585 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content\") pod \"a9ebee39-b188-43cc-8a30-cdaa5c250611\" (UID: \"a9ebee39-b188-43cc-8a30-cdaa5c250611\") " Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.060562 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities" (OuterVolumeSpecName: "utilities") pod "a9ebee39-b188-43cc-8a30-cdaa5c250611" (UID: "a9ebee39-b188-43cc-8a30-cdaa5c250611"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.064639 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp" (OuterVolumeSpecName: "kube-api-access-dn8bp") pod "a9ebee39-b188-43cc-8a30-cdaa5c250611" (UID: "a9ebee39-b188-43cc-8a30-cdaa5c250611"). InnerVolumeSpecName "kube-api-access-dn8bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.160595 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn8bp\" (UniqueName: \"kubernetes.io/projected/a9ebee39-b188-43cc-8a30-cdaa5c250611-kube-api-access-dn8bp\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.160632 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.172046 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9ebee39-b188-43cc-8a30-cdaa5c250611" (UID: "a9ebee39-b188-43cc-8a30-cdaa5c250611"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.261265 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9ebee39-b188-43cc-8a30-cdaa5c250611-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.632252 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clvn4" event={"ID":"a9ebee39-b188-43cc-8a30-cdaa5c250611","Type":"ContainerDied","Data":"d2b31bb4bf5058c320e49076bab4c7c5b97c09acc6866b2c6de69697149a51c4"} Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.632308 4848 scope.go:117] "RemoveContainer" containerID="e8a335579a31a365a3b4bbac2d051716bb690e9e17dd20507680d4d1afe1dc88" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.632314 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clvn4" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.647842 4848 scope.go:117] "RemoveContainer" containerID="ffe7a9cbb09fac2b7afd088dff748cf9e966c1a25986d0e2a946422263f7993b" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.662921 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.665244 4848 scope.go:117] "RemoveContainer" containerID="e9b8887c14e5d780a0e8d76c240d53c0cf5e3733338598076dc5d58b9c0c78ae" Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.667282 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-clvn4"] Dec 06 15:42:00 crc kubenswrapper[4848]: I1206 15:42:00.972528 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" path="/var/lib/kubelet/pods/a9ebee39-b188-43cc-8a30-cdaa5c250611/volumes" Dec 06 15:42:02 crc 
kubenswrapper[4848]: I1206 15:42:02.229746 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl"] Dec 06 15:42:02 crc kubenswrapper[4848]: E1206 15:42:02.230758 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="extract-content" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.230840 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="extract-content" Dec 06 15:42:02 crc kubenswrapper[4848]: E1206 15:42:02.230930 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="registry-server" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.230992 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="registry-server" Dec 06 15:42:02 crc kubenswrapper[4848]: E1206 15:42:02.231078 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="extract-utilities" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.231150 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="extract-utilities" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.231330 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ebee39-b188-43cc-8a30-cdaa5c250611" containerName="registry-server" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.232075 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.236867 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.237759 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.238221 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-p4jqz" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.240849 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.244169 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.261634 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.274053 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-j5wcn"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.274956 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.341349 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.342160 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.344081 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.344287 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-q68jj" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.347114 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.351170 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384081 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjxh\" (UniqueName: \"kubernetes.io/projected/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-kube-api-access-hzjxh\") pod \"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384143 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-ovs-socket\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " 
pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384242 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-nmstate-lock\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384269 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtqv\" (UniqueName: \"kubernetes.io/projected/49d9817d-6554-407c-9842-c044929eb803-kube-api-access-mbtqv\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384294 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdk7m\" (UniqueName: \"kubernetes.io/projected/b6ab59ca-8a07-452a-bc3a-b071fccdc3ff-kube-api-access-bdk7m\") pod \"nmstate-metrics-7f946cbc9-nfpfl\" (UID: \"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.384376 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-dbus-socket\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485391 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjxh\" (UniqueName: \"kubernetes.io/projected/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-kube-api-access-hzjxh\") pod 
\"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485822 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485917 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-ovs-socket\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485940 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-nmstate-lock\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485977 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-ovs-socket\") pod 
\"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.485994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtqv\" (UniqueName: \"kubernetes.io/projected/49d9817d-6554-407c-9842-c044929eb803-kube-api-access-mbtqv\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486032 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdk7m\" (UniqueName: \"kubernetes.io/projected/b6ab59ca-8a07-452a-bc3a-b071fccdc3ff-kube-api-access-bdk7m\") pod \"nmstate-metrics-7f946cbc9-nfpfl\" (UID: \"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486070 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-nmstate-lock\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486135 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg7x\" (UniqueName: 
\"kubernetes.io/projected/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-kube-api-access-zlg7x\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486176 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-dbus-socket\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.486440 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/49d9817d-6554-407c-9842-c044929eb803-dbus-socket\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.492648 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.502136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdk7m\" (UniqueName: \"kubernetes.io/projected/b6ab59ca-8a07-452a-bc3a-b071fccdc3ff-kube-api-access-bdk7m\") pod \"nmstate-metrics-7f946cbc9-nfpfl\" (UID: \"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.505178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjxh\" (UniqueName: 
\"kubernetes.io/projected/54fb82cb-0b0a-42e8-99dc-3df8b471bd89-kube-api-access-hzjxh\") pod \"nmstate-webhook-5f6d4c5ccb-8hj7l\" (UID: \"54fb82cb-0b0a-42e8-99dc-3df8b471bd89\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.505780 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtqv\" (UniqueName: \"kubernetes.io/projected/49d9817d-6554-407c-9842-c044929eb803-kube-api-access-mbtqv\") pod \"nmstate-handler-j5wcn\" (UID: \"49d9817d-6554-407c-9842-c044929eb803\") " pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.536353 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-657c64597b-mb4kw"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.537512 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.552556 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657c64597b-mb4kw"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.565859 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.581149 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.592930 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.593190 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.593873 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg7x\" (UniqueName: \"kubernetes.io/projected/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-kube-api-access-zlg7x\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.594047 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.594249 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: E1206 15:42:02.594364 4848 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 06 15:42:02 crc kubenswrapper[4848]: E1206 15:42:02.594415 4848 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert podName:ee3d85c7-86b8-4eb5-9832-65313a05b1a4 nodeName:}" failed. No retries permitted until 2025-12-06 15:42:03.094396225 +0000 UTC m=+790.392407138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-f5s7t" (UID: "ee3d85c7-86b8-4eb5-9832-65313a05b1a4") : secret "plugin-serving-cert" not found Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.613669 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg7x\" (UniqueName: \"kubernetes.io/projected/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-kube-api-access-zlg7x\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:02 crc kubenswrapper[4848]: W1206 15:42:02.628287 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49d9817d_6554_407c_9842_c044929eb803.slice/crio-89f44ea25cf6eb1eb58a2e827d816e34a3e7174732668fa4047273ae6061b19a WatchSource:0}: Error finding container 89f44ea25cf6eb1eb58a2e827d816e34a3e7174732668fa4047273ae6061b19a: Status 404 returned error can't find the container with id 89f44ea25cf6eb1eb58a2e827d816e34a3e7174732668fa4047273ae6061b19a Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.644639 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j5wcn" event={"ID":"49d9817d-6554-407c-9842-c044929eb803","Type":"ContainerStarted","Data":"89f44ea25cf6eb1eb58a2e827d816e34a3e7174732668fa4047273ae6061b19a"} Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696252 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-oauth-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696302 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-trusted-ca-bundle\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696320 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-config\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696360 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696404 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482qt\" (UniqueName: \"kubernetes.io/projected/b9ffbe53-98a4-4ed0-899e-ea52877742db-kube-api-access-482qt\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 
crc kubenswrapper[4848]: I1206 15:42:02.696421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-oauth-config\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.696449 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-service-ca\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.797864 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482qt\" (UniqueName: \"kubernetes.io/projected/b9ffbe53-98a4-4ed0-899e-ea52877742db-kube-api-access-482qt\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.797942 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-oauth-config\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.798003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-service-ca\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 
crc kubenswrapper[4848]: I1206 15:42:02.798048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-oauth-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.798130 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-trusted-ca-bundle\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.798154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-config\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.798227 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.799636 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-config\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 
15:42:02.799797 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-service-ca\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.800076 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-oauth-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.801840 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9ffbe53-98a4-4ed0-899e-ea52877742db-trusted-ca-bundle\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.805399 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-serving-cert\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.806482 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl"] Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.811415 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9ffbe53-98a4-4ed0-899e-ea52877742db-console-oauth-config\") pod \"console-657c64597b-mb4kw\" (UID: 
\"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: W1206 15:42:02.815644 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ab59ca_8a07_452a_bc3a_b071fccdc3ff.slice/crio-8e22c540a2775ab552f394a3872658623e4d9e2b4a77e697809fecaffab37f13 WatchSource:0}: Error finding container 8e22c540a2775ab552f394a3872658623e4d9e2b4a77e697809fecaffab37f13: Status 404 returned error can't find the container with id 8e22c540a2775ab552f394a3872658623e4d9e2b4a77e697809fecaffab37f13 Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.815763 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482qt\" (UniqueName: \"kubernetes.io/projected/b9ffbe53-98a4-4ed0-899e-ea52877742db-kube-api-access-482qt\") pod \"console-657c64597b-mb4kw\" (UID: \"b9ffbe53-98a4-4ed0-899e-ea52877742db\") " pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.839467 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l"] Dec 06 15:42:02 crc kubenswrapper[4848]: W1206 15:42:02.843688 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54fb82cb_0b0a_42e8_99dc_3df8b471bd89.slice/crio-8cbfedf8d56e3b072e99fcac34aacbb85d3dd5c4e868a5b88e2b0a94dd28714a WatchSource:0}: Error finding container 8cbfedf8d56e3b072e99fcac34aacbb85d3dd5c4e868a5b88e2b0a94dd28714a: Status 404 returned error can't find the container with id 8cbfedf8d56e3b072e99fcac34aacbb85d3dd5c4e868a5b88e2b0a94dd28714a Dec 06 15:42:02 crc kubenswrapper[4848]: I1206 15:42:02.866813 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.024611 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657c64597b-mb4kw"] Dec 06 15:42:03 crc kubenswrapper[4848]: W1206 15:42:03.030078 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ffbe53_98a4_4ed0_899e_ea52877742db.slice/crio-8960a1e3d8a31580a0b129557c7c6f3358734dbbe3a1e6fd59ec6db83206b6c1 WatchSource:0}: Error finding container 8960a1e3d8a31580a0b129557c7c6f3358734dbbe3a1e6fd59ec6db83206b6c1: Status 404 returned error can't find the container with id 8960a1e3d8a31580a0b129557c7c6f3358734dbbe3a1e6fd59ec6db83206b6c1 Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.102496 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.106078 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee3d85c7-86b8-4eb5-9832-65313a05b1a4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-f5s7t\" (UID: \"ee3d85c7-86b8-4eb5-9832-65313a05b1a4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.261170 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.430571 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t"] Dec 06 15:42:03 crc kubenswrapper[4848]: W1206 15:42:03.437223 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee3d85c7_86b8_4eb5_9832_65313a05b1a4.slice/crio-7acf36bedae3867f7c3858fb231993a2aa6326da4b3d4da2ba617ab3c49b6ab9 WatchSource:0}: Error finding container 7acf36bedae3867f7c3858fb231993a2aa6326da4b3d4da2ba617ab3c49b6ab9: Status 404 returned error can't find the container with id 7acf36bedae3867f7c3858fb231993a2aa6326da4b3d4da2ba617ab3c49b6ab9 Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.651116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" event={"ID":"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff","Type":"ContainerStarted","Data":"8e22c540a2775ab552f394a3872658623e4d9e2b4a77e697809fecaffab37f13"} Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.652467 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657c64597b-mb4kw" event={"ID":"b9ffbe53-98a4-4ed0-899e-ea52877742db","Type":"ContainerStarted","Data":"a4e2dedbd70a392107fa433d4fd84621c437c542d6a9fcdf5c609654d0d64eb5"} Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.652513 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657c64597b-mb4kw" event={"ID":"b9ffbe53-98a4-4ed0-899e-ea52877742db","Type":"ContainerStarted","Data":"8960a1e3d8a31580a0b129557c7c6f3358734dbbe3a1e6fd59ec6db83206b6c1"} Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.655089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" 
event={"ID":"54fb82cb-0b0a-42e8-99dc-3df8b471bd89","Type":"ContainerStarted","Data":"8cbfedf8d56e3b072e99fcac34aacbb85d3dd5c4e868a5b88e2b0a94dd28714a"} Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.657936 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" event={"ID":"ee3d85c7-86b8-4eb5-9832-65313a05b1a4","Type":"ContainerStarted","Data":"7acf36bedae3867f7c3858fb231993a2aa6326da4b3d4da2ba617ab3c49b6ab9"} Dec 06 15:42:03 crc kubenswrapper[4848]: I1206 15:42:03.670296 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-657c64597b-mb4kw" podStartSLOduration=1.670275584 podStartE2EDuration="1.670275584s" podCreationTimestamp="2025-12-06 15:42:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:42:03.667048467 +0000 UTC m=+790.965059390" watchObservedRunningTime="2025-12-06 15:42:03.670275584 +0000 UTC m=+790.968286497" Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.673607 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j5wcn" event={"ID":"49d9817d-6554-407c-9842-c044929eb803","Type":"ContainerStarted","Data":"a7bee78a0fe75fe541083b5a74e77d9c2c6e963dbac752c20ccd4e8e11a6b616"} Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.674818 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.676294 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" event={"ID":"54fb82cb-0b0a-42e8-99dc-3df8b471bd89","Type":"ContainerStarted","Data":"9fc2ac92c1e5b48c43d852db4a3cd45f1d8745520a17deacc2d4ee0bdbeba46e"} Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.676388 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.677811 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" event={"ID":"ee3d85c7-86b8-4eb5-9832-65313a05b1a4","Type":"ContainerStarted","Data":"3e76aaeb5ecdcb303c2ae8ba248b94accc52d4cd9b55321a56825bd3d08232e8"} Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.679918 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" event={"ID":"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff","Type":"ContainerStarted","Data":"eaa8243ad5310dc4c6a099ba10a4cd5c11e4c3b3381ebc42fcc4a3a97e6b307c"} Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.705430 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-f5s7t" podStartSLOduration=1.856181125 podStartE2EDuration="3.705408021s" podCreationTimestamp="2025-12-06 15:42:02 +0000 UTC" firstStartedPulling="2025-12-06 15:42:03.439270771 +0000 UTC m=+790.737281684" lastFinishedPulling="2025-12-06 15:42:05.288497667 +0000 UTC m=+792.586508580" observedRunningTime="2025-12-06 15:42:05.702247744 +0000 UTC m=+793.000258677" watchObservedRunningTime="2025-12-06 15:42:05.705408021 +0000 UTC m=+793.003418954" Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.712028 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-j5wcn" podStartSLOduration=1.032619815 podStartE2EDuration="3.712006329s" podCreationTimestamp="2025-12-06 15:42:02 +0000 UTC" firstStartedPulling="2025-12-06 15:42:02.630524704 +0000 UTC m=+789.928535617" lastFinishedPulling="2025-12-06 15:42:05.309911218 +0000 UTC m=+792.607922131" observedRunningTime="2025-12-06 15:42:05.691239716 +0000 UTC m=+792.989250679" watchObservedRunningTime="2025-12-06 15:42:05.712006329 +0000 UTC 
m=+793.010017252" Dec 06 15:42:05 crc kubenswrapper[4848]: I1206 15:42:05.717792 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" podStartSLOduration=1.274706289 podStartE2EDuration="3.717771335s" podCreationTimestamp="2025-12-06 15:42:02 +0000 UTC" firstStartedPulling="2025-12-06 15:42:02.845961815 +0000 UTC m=+790.143972728" lastFinishedPulling="2025-12-06 15:42:05.289026851 +0000 UTC m=+792.587037774" observedRunningTime="2025-12-06 15:42:05.716919063 +0000 UTC m=+793.014929976" watchObservedRunningTime="2025-12-06 15:42:05.717771335 +0000 UTC m=+793.015782248" Dec 06 15:42:07 crc kubenswrapper[4848]: I1206 15:42:07.693477 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" event={"ID":"b6ab59ca-8a07-452a-bc3a-b071fccdc3ff","Type":"ContainerStarted","Data":"db4cf423785f96520b4342e2546968768ca8ad4dc4ea9d6c947b0b8e3c20a3ad"} Dec 06 15:42:07 crc kubenswrapper[4848]: I1206 15:42:07.716894 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nfpfl" podStartSLOduration=1.275019398 podStartE2EDuration="5.716869465s" podCreationTimestamp="2025-12-06 15:42:02 +0000 UTC" firstStartedPulling="2025-12-06 15:42:02.817624527 +0000 UTC m=+790.115635440" lastFinishedPulling="2025-12-06 15:42:07.259474594 +0000 UTC m=+794.557485507" observedRunningTime="2025-12-06 15:42:07.713575576 +0000 UTC m=+795.011586499" watchObservedRunningTime="2025-12-06 15:42:07.716869465 +0000 UTC m=+795.014880408" Dec 06 15:42:12 crc kubenswrapper[4848]: I1206 15:42:12.615606 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-j5wcn" Dec 06 15:42:12 crc kubenswrapper[4848]: I1206 15:42:12.867889 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:12 crc 
kubenswrapper[4848]: I1206 15:42:12.867970 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:12 crc kubenswrapper[4848]: I1206 15:42:12.875086 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:13 crc kubenswrapper[4848]: I1206 15:42:13.734584 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-657c64597b-mb4kw" Dec 06 15:42:13 crc kubenswrapper[4848]: I1206 15:42:13.803356 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:42:22 crc kubenswrapper[4848]: I1206 15:42:22.587960 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8hj7l" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.451030 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd"] Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.452608 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.454646 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.466122 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd"] Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.608400 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmjr\" (UniqueName: \"kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.608485 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.608588 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: 
I1206 15:42:34.709365 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.709946 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmjr\" (UniqueName: \"kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.709986 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.710051 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.710372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.732491 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmjr\" (UniqueName: \"kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:34 crc kubenswrapper[4848]: I1206 15:42:34.809558 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:35 crc kubenswrapper[4848]: I1206 15:42:35.201713 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd"] Dec 06 15:42:35 crc kubenswrapper[4848]: I1206 15:42:35.856276 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerStarted","Data":"1f04dcd4db112c2c520e9f70db8cca16125e5f2c7fc4ae401dcd5aa90801d22d"} Dec 06 15:42:35 crc kubenswrapper[4848]: I1206 15:42:35.856655 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerStarted","Data":"987516cd7234f003d5c2647378bdb2f09902093821d0e67e8b20e64ef5187f99"} Dec 06 15:42:36 crc kubenswrapper[4848]: I1206 15:42:36.863309 4848 
generic.go:334] "Generic (PLEG): container finished" podID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerID="1f04dcd4db112c2c520e9f70db8cca16125e5f2c7fc4ae401dcd5aa90801d22d" exitCode=0 Dec 06 15:42:36 crc kubenswrapper[4848]: I1206 15:42:36.863354 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerDied","Data":"1f04dcd4db112c2c520e9f70db8cca16125e5f2c7fc4ae401dcd5aa90801d22d"} Dec 06 15:42:38 crc kubenswrapper[4848]: I1206 15:42:38.846889 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-mdg75" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" containerID="cri-o://0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1" gracePeriod=15 Dec 06 15:42:38 crc kubenswrapper[4848]: I1206 15:42:38.876505 4848 generic.go:334] "Generic (PLEG): container finished" podID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerID="1f70d88a8a8063c14b0fcd4f1f9023175d42ff7470646c786a40aec57ade0d79" exitCode=0 Dec 06 15:42:38 crc kubenswrapper[4848]: I1206 15:42:38.876562 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerDied","Data":"1f70d88a8a8063c14b0fcd4f1f9023175d42ff7470646c786a40aec57ade0d79"} Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.285598 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mdg75_f250df39-ff33-455c-9edc-cb1997a8c782/console/0.log" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.286125 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376207 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376281 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376312 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376355 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376441 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w7st\" (UniqueName: \"kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376482 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.376512 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config\") pod \"f250df39-ff33-455c-9edc-cb1997a8c782\" (UID: \"f250df39-ff33-455c-9edc-cb1997a8c782\") " Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.377317 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.377452 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.377682 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config" (OuterVolumeSpecName: "console-config") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.378023 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca" (OuterVolumeSpecName: "service-ca") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.382428 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st" (OuterVolumeSpecName: "kube-api-access-5w7st") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "kube-api-access-5w7st". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.382455 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.382598 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f250df39-ff33-455c-9edc-cb1997a8c782" (UID: "f250df39-ff33-455c-9edc-cb1997a8c782"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478018 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478091 4848 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478107 4848 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478116 4848 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f250df39-ff33-455c-9edc-cb1997a8c782-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478129 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478138 4848 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f250df39-ff33-455c-9edc-cb1997a8c782-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.478147 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w7st\" (UniqueName: \"kubernetes.io/projected/f250df39-ff33-455c-9edc-cb1997a8c782-kube-api-access-5w7st\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:39 crc 
kubenswrapper[4848]: I1206 15:42:39.882681 4848 generic.go:334] "Generic (PLEG): container finished" podID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerID="184788d42048a95f3df7783b5633b66a84a7662578d10f2789fb03913a65db31" exitCode=0 Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.882846 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerDied","Data":"184788d42048a95f3df7783b5633b66a84a7662578d10f2789fb03913a65db31"} Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884116 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mdg75_f250df39-ff33-455c-9edc-cb1997a8c782/console/0.log" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884185 4848 generic.go:334] "Generic (PLEG): container finished" podID="f250df39-ff33-455c-9edc-cb1997a8c782" containerID="0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1" exitCode=2 Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884225 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mdg75" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884224 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mdg75" event={"ID":"f250df39-ff33-455c-9edc-cb1997a8c782","Type":"ContainerDied","Data":"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1"} Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884346 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mdg75" event={"ID":"f250df39-ff33-455c-9edc-cb1997a8c782","Type":"ContainerDied","Data":"29ef2bfa459ffb9a9cae93c1b05d94618b2f3ab5544c7ee1d52c8d930501739c"} Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.884369 4848 scope.go:117] "RemoveContainer" containerID="0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.903142 4848 scope.go:117] "RemoveContainer" containerID="0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1" Dec 06 15:42:39 crc kubenswrapper[4848]: E1206 15:42:39.903553 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1\": container with ID starting with 0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1 not found: ID does not exist" containerID="0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.903596 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1"} err="failed to get container status \"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1\": rpc error: code = NotFound desc = could not find container \"0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1\": 
container with ID starting with 0c5561f2bb3859230b00996cd08feb0d80df1602c37e1d5ce37ac61bd01cc9e1 not found: ID does not exist" Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.918446 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:42:39 crc kubenswrapper[4848]: I1206 15:42:39.923165 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-mdg75"] Dec 06 15:42:40 crc kubenswrapper[4848]: I1206 15:42:40.976548 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" path="/var/lib/kubelet/pods/f250df39-ff33-455c-9edc-cb1997a8c782/volumes" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.159720 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.303775 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle\") pod \"c523ae20-637c-439b-9869-98cf3ac3c8a0\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.303910 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util\") pod \"c523ae20-637c-439b-9869-98cf3ac3c8a0\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.303942 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmjr\" (UniqueName: \"kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr\") pod \"c523ae20-637c-439b-9869-98cf3ac3c8a0\" (UID: \"c523ae20-637c-439b-9869-98cf3ac3c8a0\") " Dec 06 
15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.306098 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle" (OuterVolumeSpecName: "bundle") pod "c523ae20-637c-439b-9869-98cf3ac3c8a0" (UID: "c523ae20-637c-439b-9869-98cf3ac3c8a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.309752 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr" (OuterVolumeSpecName: "kube-api-access-snmjr") pod "c523ae20-637c-439b-9869-98cf3ac3c8a0" (UID: "c523ae20-637c-439b-9869-98cf3ac3c8a0"). InnerVolumeSpecName "kube-api-access-snmjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.405752 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmjr\" (UniqueName: \"kubernetes.io/projected/c523ae20-637c-439b-9869-98cf3ac3c8a0-kube-api-access-snmjr\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.406058 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.438005 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util" (OuterVolumeSpecName: "util") pod "c523ae20-637c-439b-9869-98cf3ac3c8a0" (UID: "c523ae20-637c-439b-9869-98cf3ac3c8a0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.507378 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c523ae20-637c-439b-9869-98cf3ac3c8a0-util\") on node \"crc\" DevicePath \"\"" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.900202 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" event={"ID":"c523ae20-637c-439b-9869-98cf3ac3c8a0","Type":"ContainerDied","Data":"987516cd7234f003d5c2647378bdb2f09902093821d0e67e8b20e64ef5187f99"} Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.900247 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987516cd7234f003d5c2647378bdb2f09902093821d0e67e8b20e64ef5187f99" Dec 06 15:42:41 crc kubenswrapper[4848]: I1206 15:42:41.900247 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.475772 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq"] Dec 06 15:42:49 crc kubenswrapper[4848]: E1206 15:42:49.476498 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476512 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" Dec 06 15:42:49 crc kubenswrapper[4848]: E1206 15:42:49.476524 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="extract" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476530 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="extract" Dec 06 15:42:49 crc kubenswrapper[4848]: E1206 15:42:49.476543 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="util" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476549 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="util" Dec 06 15:42:49 crc kubenswrapper[4848]: E1206 15:42:49.476556 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="pull" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476561 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="pull" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476670 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c523ae20-637c-439b-9869-98cf3ac3c8a0" containerName="extract" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.476691 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f250df39-ff33-455c-9edc-cb1997a8c782" containerName="console" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.477204 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.482606 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.483251 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.483485 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.483787 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ff559" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.485039 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.524403 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq"] Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.607952 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58bp\" (UniqueName: \"kubernetes.io/projected/b31b58c9-49c3-4e31-b838-5532169c319b-kube-api-access-w58bp\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.608023 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-apiservice-cert\") pod 
\"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.608075 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-webhook-cert\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.709109 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58bp\" (UniqueName: \"kubernetes.io/projected/b31b58c9-49c3-4e31-b838-5532169c319b-kube-api-access-w58bp\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.709183 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-apiservice-cert\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.709236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-webhook-cert\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: 
I1206 15:42:49.718553 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-webhook-cert\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.718612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b31b58c9-49c3-4e31-b838-5532169c319b-apiservice-cert\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.727409 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58bp\" (UniqueName: \"kubernetes.io/projected/b31b58c9-49c3-4e31-b838-5532169c319b-kube-api-access-w58bp\") pod \"metallb-operator-controller-manager-d8c5d4748-tk9lq\" (UID: \"b31b58c9-49c3-4e31-b838-5532169c319b\") " pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.783940 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9"] Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.784836 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.787019 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.787040 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cn5vq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.787297 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.795123 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.862034 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9"] Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.912650 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-webhook-cert\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.912745 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-apiservice-cert\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:49 crc kubenswrapper[4848]: I1206 15:42:49.912768 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4rf\" (UniqueName: \"kubernetes.io/projected/d03f9739-5d90-45b0-a717-bc66f0522234-kube-api-access-5c4rf\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.014164 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-webhook-cert\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.014286 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4rf\" (UniqueName: \"kubernetes.io/projected/d03f9739-5d90-45b0-a717-bc66f0522234-kube-api-access-5c4rf\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.014311 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-apiservice-cert\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.019824 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-webhook-cert\") pod 
\"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.026357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d03f9739-5d90-45b0-a717-bc66f0522234-apiservice-cert\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.040746 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4rf\" (UniqueName: \"kubernetes.io/projected/d03f9739-5d90-45b0-a717-bc66f0522234-kube-api-access-5c4rf\") pod \"metallb-operator-webhook-server-57cbc9df7c-kltd9\" (UID: \"d03f9739-5d90-45b0-a717-bc66f0522234\") " pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.105359 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.287858 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq"] Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.559321 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9"] Dec 06 15:42:50 crc kubenswrapper[4848]: W1206 15:42:50.561669 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03f9739_5d90_45b0_a717_bc66f0522234.slice/crio-b37a7cdd3843dc2d7d1e70f060ab8354e3b6f187b75d2281b1121a030a911aaa WatchSource:0}: Error finding container b37a7cdd3843dc2d7d1e70f060ab8354e3b6f187b75d2281b1121a030a911aaa: Status 404 returned error can't find the container with id b37a7cdd3843dc2d7d1e70f060ab8354e3b6f187b75d2281b1121a030a911aaa Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.953681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" event={"ID":"d03f9739-5d90-45b0-a717-bc66f0522234","Type":"ContainerStarted","Data":"b37a7cdd3843dc2d7d1e70f060ab8354e3b6f187b75d2281b1121a030a911aaa"} Dec 06 15:42:50 crc kubenswrapper[4848]: I1206 15:42:50.954988 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" event={"ID":"b31b58c9-49c3-4e31-b838-5532169c319b","Type":"ContainerStarted","Data":"ded67c9004232c88cc82d93c247112973610634545d9f1a1dcd33f43e5c5e1df"} Dec 06 15:42:55 crc kubenswrapper[4848]: I1206 15:42:55.988103 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" 
event={"ID":"d03f9739-5d90-45b0-a717-bc66f0522234","Type":"ContainerStarted","Data":"d9c0785c6bd3ec08b836406d47aafa6ca29f6ceb9b3be088437a6439326f9514"} Dec 06 15:42:55 crc kubenswrapper[4848]: I1206 15:42:55.988637 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:42:56 crc kubenswrapper[4848]: I1206 15:42:56.008854 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" podStartSLOduration=2.270157517 podStartE2EDuration="7.008836592s" podCreationTimestamp="2025-12-06 15:42:49 +0000 UTC" firstStartedPulling="2025-12-06 15:42:50.566185941 +0000 UTC m=+837.864196864" lastFinishedPulling="2025-12-06 15:42:55.304865026 +0000 UTC m=+842.602875939" observedRunningTime="2025-12-06 15:42:56.007613629 +0000 UTC m=+843.305624552" watchObservedRunningTime="2025-12-06 15:42:56.008836592 +0000 UTC m=+843.306847505" Dec 06 15:42:56 crc kubenswrapper[4848]: I1206 15:42:56.995968 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" event={"ID":"b31b58c9-49c3-4e31-b838-5532169c319b","Type":"ContainerStarted","Data":"f767e37e88ed15f443e1ec01d5f074150867a516ad382a1e71f65b8ef3ccba23"} Dec 06 15:42:56 crc kubenswrapper[4848]: I1206 15:42:56.996288 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:42:57 crc kubenswrapper[4848]: I1206 15:42:57.017686 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" podStartSLOduration=1.495060911 podStartE2EDuration="8.017665323s" podCreationTimestamp="2025-12-06 15:42:49 +0000 UTC" firstStartedPulling="2025-12-06 15:42:50.294199256 +0000 UTC m=+837.592210159" lastFinishedPulling="2025-12-06 
15:42:56.816803658 +0000 UTC m=+844.114814571" observedRunningTime="2025-12-06 15:42:57.015102025 +0000 UTC m=+844.313112948" watchObservedRunningTime="2025-12-06 15:42:57.017665323 +0000 UTC m=+844.315676236" Dec 06 15:43:10 crc kubenswrapper[4848]: I1206 15:43:10.110159 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57cbc9df7c-kltd9" Dec 06 15:43:26 crc kubenswrapper[4848]: I1206 15:43:26.895380 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:26 crc kubenswrapper[4848]: I1206 15:43:26.897565 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:26 crc kubenswrapper[4848]: I1206 15:43:26.910486 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.005006 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.005156 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.005234 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlpnm\" (UniqueName: 
\"kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.106718 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.106808 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.106838 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlpnm\" (UniqueName: \"kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.107463 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.107501 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.126736 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlpnm\" (UniqueName: \"kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm\") pod \"certified-operators-w2tb4\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.226042 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:27 crc kubenswrapper[4848]: I1206 15:43:27.709294 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:28 crc kubenswrapper[4848]: I1206 15:43:28.169406 4848 generic.go:334] "Generic (PLEG): container finished" podID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerID="d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652" exitCode=0 Dec 06 15:43:28 crc kubenswrapper[4848]: I1206 15:43:28.169555 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerDied","Data":"d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652"} Dec 06 15:43:28 crc kubenswrapper[4848]: I1206 15:43:28.172087 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerStarted","Data":"81a71031713438d3c016ab19da0aa48d92137e4d3f5c37cfb8ee392c30be4444"} Dec 06 15:43:29 crc kubenswrapper[4848]: I1206 15:43:29.178267 4848 generic.go:334] "Generic (PLEG): container 
finished" podID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerID="123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d" exitCode=0 Dec 06 15:43:29 crc kubenswrapper[4848]: I1206 15:43:29.178302 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerDied","Data":"123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d"} Dec 06 15:43:29 crc kubenswrapper[4848]: I1206 15:43:29.797886 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-d8c5d4748-tk9lq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.186137 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerStarted","Data":"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c"} Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.215684 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w2tb4" podStartSLOduration=2.844185094 podStartE2EDuration="4.215662437s" podCreationTimestamp="2025-12-06 15:43:26 +0000 UTC" firstStartedPulling="2025-12-06 15:43:28.171124056 +0000 UTC m=+875.469134969" lastFinishedPulling="2025-12-06 15:43:29.542601409 +0000 UTC m=+876.840612312" observedRunningTime="2025-12-06 15:43:30.210415995 +0000 UTC m=+877.508426908" watchObservedRunningTime="2025-12-06 15:43:30.215662437 +0000 UTC m=+877.513673350" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.611275 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pqzrf"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.613820 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.617197 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.617470 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-js2qj" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.617197 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.628913 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.629756 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.632332 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.640751 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660376 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-metrics\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660431 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebaeee0b-28be-4402-8323-57dad52490a6-metrics-certs\") pod \"frr-k8s-pqzrf\" (UID: 
\"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660464 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-reloader\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660492 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql69f\" (UniqueName: \"kubernetes.io/projected/ebaeee0b-28be-4402-8323-57dad52490a6-kube-api-access-ql69f\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660513 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-sockets\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660542 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-conf\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660569 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f1be97-6216-4892-8da4-32dac60daaaa-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660615 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ebaeee0b-28be-4402-8323-57dad52490a6-frr-startup\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.660641 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2ch\" (UniqueName: \"kubernetes.io/projected/99f1be97-6216-4892-8da4-32dac60daaaa-kube-api-access-sn2ch\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.700063 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4b5lc"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.703377 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.707843 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.708083 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5pjvh" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.708240 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.708383 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.719387 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-9xn7s"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.720454 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.724893 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.730554 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9xn7s"] Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761662 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fjc6\" (UniqueName: \"kubernetes.io/projected/1649535c-6c66-412c-be24-f452edfe82a1-kube-api-access-8fjc6\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761795 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-cert\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761837 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-metrics\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebaeee0b-28be-4402-8323-57dad52490a6-metrics-certs\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761880 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-reloader\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761896 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761917 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql69f\" (UniqueName: \"kubernetes.io/projected/ebaeee0b-28be-4402-8323-57dad52490a6-kube-api-access-ql69f\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-sockets\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmmb\" (UniqueName: \"kubernetes.io/projected/53fefa8d-ef28-4a20-8e75-b633d64f4863-kube-api-access-lfmmb\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761975 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-conf\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.761993 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f1be97-6216-4892-8da4-32dac60daaaa-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762031 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1649535c-6c66-412c-be24-f452edfe82a1-metallb-excludel2\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762058 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762075 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ebaeee0b-28be-4402-8323-57dad52490a6-frr-startup\") pod 
\"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762097 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2ch\" (UniqueName: \"kubernetes.io/projected/99f1be97-6216-4892-8da4-32dac60daaaa-kube-api-access-sn2ch\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.762750 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-metrics\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.763417 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-reloader\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.763811 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-conf\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.764767 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ebaeee0b-28be-4402-8323-57dad52490a6-frr-sockets\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.768367 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebaeee0b-28be-4402-8323-57dad52490a6-metrics-certs\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.769778 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ebaeee0b-28be-4402-8323-57dad52490a6-frr-startup\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.780268 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f1be97-6216-4892-8da4-32dac60daaaa-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.786164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2ch\" (UniqueName: \"kubernetes.io/projected/99f1be97-6216-4892-8da4-32dac60daaaa-kube-api-access-sn2ch\") pod \"frr-k8s-webhook-server-7fcb986d4-ptrrq\" (UID: \"99f1be97-6216-4892-8da4-32dac60daaaa\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.793829 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql69f\" (UniqueName: \"kubernetes.io/projected/ebaeee0b-28be-4402-8323-57dad52490a6-kube-api-access-ql69f\") pod \"frr-k8s-pqzrf\" (UID: \"ebaeee0b-28be-4402-8323-57dad52490a6\") " pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863441 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863509 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmmb\" (UniqueName: \"kubernetes.io/projected/53fefa8d-ef28-4a20-8e75-b633d64f4863-kube-api-access-lfmmb\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863537 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863566 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1649535c-6c66-412c-be24-f452edfe82a1-metallb-excludel2\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863603 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863614 4848 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863634 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8fjc6\" (UniqueName: \"kubernetes.io/projected/1649535c-6c66-412c-be24-f452edfe82a1-kube-api-access-8fjc6\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.863658 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-cert\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863676 4848 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863679 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs podName:1649535c-6c66-412c-be24-f452edfe82a1 nodeName:}" failed. No retries permitted until 2025-12-06 15:43:31.363658766 +0000 UTC m=+878.661669679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs") pod "speaker-4b5lc" (UID: "1649535c-6c66-412c-be24-f452edfe82a1") : secret "speaker-certs-secret" not found Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863730 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs podName:53fefa8d-ef28-4a20-8e75-b633d64f4863 nodeName:}" failed. No retries permitted until 2025-12-06 15:43:31.363720657 +0000 UTC m=+878.661731570 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs") pod "controller-f8648f98b-9xn7s" (UID: "53fefa8d-ef28-4a20-8e75-b633d64f4863") : secret "controller-certs-secret" not found Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863823 4848 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 15:43:30 crc kubenswrapper[4848]: E1206 15:43:30.863923 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist podName:1649535c-6c66-412c-be24-f452edfe82a1 nodeName:}" failed. No retries permitted until 2025-12-06 15:43:31.363900492 +0000 UTC m=+878.661911485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist") pod "speaker-4b5lc" (UID: "1649535c-6c66-412c-be24-f452edfe82a1") : secret "metallb-memberlist" not found Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.864379 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1649535c-6c66-412c-be24-f452edfe82a1-metallb-excludel2\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.869792 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.878091 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-cert\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: 
I1206 15:43:30.887488 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmmb\" (UniqueName: \"kubernetes.io/projected/53fefa8d-ef28-4a20-8e75-b633d64f4863-kube-api-access-lfmmb\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.894275 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjc6\" (UniqueName: \"kubernetes.io/projected/1649535c-6c66-412c-be24-f452edfe82a1-kube-api-access-8fjc6\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.932596 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:30 crc kubenswrapper[4848]: I1206 15:43:30.946063 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.125906 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq"] Dec 06 15:43:31 crc kubenswrapper[4848]: W1206 15:43:31.135059 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f1be97_6216_4892_8da4_32dac60daaaa.slice/crio-5974d073a2903f597c1bea6cbb514a29d93e50dcbe409bf1887c98d50bd94d01 WatchSource:0}: Error finding container 5974d073a2903f597c1bea6cbb514a29d93e50dcbe409bf1887c98d50bd94d01: Status 404 returned error can't find the container with id 5974d073a2903f597c1bea6cbb514a29d93e50dcbe409bf1887c98d50bd94d01 Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.193737 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" event={"ID":"99f1be97-6216-4892-8da4-32dac60daaaa","Type":"ContainerStarted","Data":"5974d073a2903f597c1bea6cbb514a29d93e50dcbe409bf1887c98d50bd94d01"} Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.194624 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"07cb0f9a405db0e34cf8d9e2be4357e451c628039a450bef611b51c944fc2c43"} Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.370049 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.370117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.370177 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:31 crc kubenswrapper[4848]: E1206 15:43:31.370370 4848 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 15:43:31 crc kubenswrapper[4848]: E1206 15:43:31.370455 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist podName:1649535c-6c66-412c-be24-f452edfe82a1 nodeName:}" failed. No retries permitted until 2025-12-06 15:43:32.370436966 +0000 UTC m=+879.668447879 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist") pod "speaker-4b5lc" (UID: "1649535c-6c66-412c-be24-f452edfe82a1") : secret "metallb-memberlist" not found Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.374865 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53fefa8d-ef28-4a20-8e75-b633d64f4863-metrics-certs\") pod \"controller-f8648f98b-9xn7s\" (UID: \"53fefa8d-ef28-4a20-8e75-b633d64f4863\") " pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.374878 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-metrics-certs\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:31 crc kubenswrapper[4848]: I1206 15:43:31.641993 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.049380 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9xn7s"] Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.201460 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9xn7s" event={"ID":"53fefa8d-ef28-4a20-8e75-b633d64f4863","Type":"ContainerStarted","Data":"992f1169d541a2d6fd0b696f42850d19c5dbf7f82d4e3b7863f217a69916c2f7"} Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.202681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9xn7s" event={"ID":"53fefa8d-ef28-4a20-8e75-b633d64f4863","Type":"ContainerStarted","Data":"a16c222b0c059dc8b4c58e63a2346d5e6820c1c39b18d52498d21f1dee55194a"} Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.384843 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.397272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1649535c-6c66-412c-be24-f452edfe82a1-memberlist\") pod \"speaker-4b5lc\" (UID: \"1649535c-6c66-412c-be24-f452edfe82a1\") " pod="metallb-system/speaker-4b5lc" Dec 06 15:43:32 crc kubenswrapper[4848]: I1206 15:43:32.529819 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-4b5lc" Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.211503 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4b5lc" event={"ID":"1649535c-6c66-412c-be24-f452edfe82a1","Type":"ContainerStarted","Data":"a9a68658a8c68c5994d4977e2d02f92c58939700ecfa6240e58bf2e342fdcc89"} Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.211887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4b5lc" event={"ID":"1649535c-6c66-412c-be24-f452edfe82a1","Type":"ContainerStarted","Data":"e49300540ad926cd1677f596d308506e09dcb02ef301dfb216adcc4ad78b7a14"} Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.211903 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4b5lc" event={"ID":"1649535c-6c66-412c-be24-f452edfe82a1","Type":"ContainerStarted","Data":"057266c7b7b04ecaadcee82dd85dc08fd37fc726cbad3e7e0024fa7e6cd073be"} Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.212100 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4b5lc" Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.214257 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9xn7s" event={"ID":"53fefa8d-ef28-4a20-8e75-b633d64f4863","Type":"ContainerStarted","Data":"9008fac8b0df8c4ec7d9e4c584754acd554494dd102b0885febab9ea01b2f7d9"} Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.214448 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.233042 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4b5lc" podStartSLOduration=3.233023374 podStartE2EDuration="3.233023374s" podCreationTimestamp="2025-12-06 15:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:43:33.227589567 +0000 UTC m=+880.525600490" watchObservedRunningTime="2025-12-06 15:43:33.233023374 +0000 UTC m=+880.531034287" Dec 06 15:43:33 crc kubenswrapper[4848]: I1206 15:43:33.246714 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-9xn7s" podStartSLOduration=3.246677294 podStartE2EDuration="3.246677294s" podCreationTimestamp="2025-12-06 15:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:43:33.243159068 +0000 UTC m=+880.541169991" watchObservedRunningTime="2025-12-06 15:43:33.246677294 +0000 UTC m=+880.544688207" Dec 06 15:43:37 crc kubenswrapper[4848]: I1206 15:43:37.227169 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:37 crc kubenswrapper[4848]: I1206 15:43:37.227522 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:37 crc kubenswrapper[4848]: I1206 15:43:37.325443 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:37 crc kubenswrapper[4848]: I1206 15:43:37.371901 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:37 crc kubenswrapper[4848]: I1206 15:43:37.554951 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.303818 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" 
event={"ID":"99f1be97-6216-4892-8da4-32dac60daaaa","Type":"ContainerStarted","Data":"0fffbf273c12069894acd12c1c417457280dd33c4f0bf2a815bcd2ced8357d95"} Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.304176 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.308483 4848 generic.go:334] "Generic (PLEG): container finished" podID="ebaeee0b-28be-4402-8323-57dad52490a6" containerID="36eed2f0721a52ac0d97925f0670b903a1ae75410cb92525180e811f3b9228c8" exitCode=0 Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.308546 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerDied","Data":"36eed2f0721a52ac0d97925f0670b903a1ae75410cb92525180e811f3b9228c8"} Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.308710 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w2tb4" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="registry-server" containerID="cri-o://c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c" gracePeriod=2 Dec 06 15:43:39 crc kubenswrapper[4848]: I1206 15:43:39.323414 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" podStartSLOduration=1.701837541 podStartE2EDuration="9.323396206s" podCreationTimestamp="2025-12-06 15:43:30 +0000 UTC" firstStartedPulling="2025-12-06 15:43:31.13675358 +0000 UTC m=+878.434764493" lastFinishedPulling="2025-12-06 15:43:38.758312245 +0000 UTC m=+886.056323158" observedRunningTime="2025-12-06 15:43:39.319979663 +0000 UTC m=+886.617990576" watchObservedRunningTime="2025-12-06 15:43:39.323396206 +0000 UTC m=+886.621407119" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.185498 4848 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.217500 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities\") pod \"247cc0d5-df71-4d57-bf35-bc008d40a441\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.217709 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content\") pod \"247cc0d5-df71-4d57-bf35-bc008d40a441\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.217750 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlpnm\" (UniqueName: \"kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm\") pod \"247cc0d5-df71-4d57-bf35-bc008d40a441\" (UID: \"247cc0d5-df71-4d57-bf35-bc008d40a441\") " Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.218332 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities" (OuterVolumeSpecName: "utilities") pod "247cc0d5-df71-4d57-bf35-bc008d40a441" (UID: "247cc0d5-df71-4d57-bf35-bc008d40a441"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.224956 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm" (OuterVolumeSpecName: "kube-api-access-wlpnm") pod "247cc0d5-df71-4d57-bf35-bc008d40a441" (UID: "247cc0d5-df71-4d57-bf35-bc008d40a441"). 
InnerVolumeSpecName "kube-api-access-wlpnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.260102 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247cc0d5-df71-4d57-bf35-bc008d40a441" (UID: "247cc0d5-df71-4d57-bf35-bc008d40a441"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.316087 4848 generic.go:334] "Generic (PLEG): container finished" podID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerID="c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c" exitCode=0 Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.316119 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerDied","Data":"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c"} Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.316159 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2tb4" event={"ID":"247cc0d5-df71-4d57-bf35-bc008d40a441","Type":"ContainerDied","Data":"81a71031713438d3c016ab19da0aa48d92137e4d3f5c37cfb8ee392c30be4444"} Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.316169 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2tb4" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.316221 4848 scope.go:117] "RemoveContainer" containerID="c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.317942 4848 generic.go:334] "Generic (PLEG): container finished" podID="ebaeee0b-28be-4402-8323-57dad52490a6" containerID="1cb4bcc88adaf4e353496945d489c1725770bbd37e30ca7e601aab39019fc1e5" exitCode=0 Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.318015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerDied","Data":"1cb4bcc88adaf4e353496945d489c1725770bbd37e30ca7e601aab39019fc1e5"} Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.319207 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.319226 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247cc0d5-df71-4d57-bf35-bc008d40a441-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.319239 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlpnm\" (UniqueName: \"kubernetes.io/projected/247cc0d5-df71-4d57-bf35-bc008d40a441-kube-api-access-wlpnm\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.345965 4848 scope.go:117] "RemoveContainer" containerID="123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.364841 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:40 crc 
kubenswrapper[4848]: I1206 15:43:40.375872 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w2tb4"] Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.378793 4848 scope.go:117] "RemoveContainer" containerID="d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.393583 4848 scope.go:117] "RemoveContainer" containerID="c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c" Dec 06 15:43:40 crc kubenswrapper[4848]: E1206 15:43:40.393946 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c\": container with ID starting with c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c not found: ID does not exist" containerID="c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.393975 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c"} err="failed to get container status \"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c\": rpc error: code = NotFound desc = could not find container \"c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c\": container with ID starting with c0b51c13e914b540320809ac8aa73db33c9f09c1b8cd265f6184000ebc55519c not found: ID does not exist" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.393995 4848 scope.go:117] "RemoveContainer" containerID="123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d" Dec 06 15:43:40 crc kubenswrapper[4848]: E1206 15:43:40.394287 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d\": container with ID starting with 123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d not found: ID does not exist" containerID="123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.394307 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d"} err="failed to get container status \"123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d\": rpc error: code = NotFound desc = could not find container \"123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d\": container with ID starting with 123fa7af9985b4d1d11761d9b378dddd4e6e4c7e3925683797a5b449a275419d not found: ID does not exist" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.394318 4848 scope.go:117] "RemoveContainer" containerID="d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652" Dec 06 15:43:40 crc kubenswrapper[4848]: E1206 15:43:40.394512 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652\": container with ID starting with d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652 not found: ID does not exist" containerID="d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.394530 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652"} err="failed to get container status \"d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652\": rpc error: code = NotFound desc = could not find container \"d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652\": container with ID 
starting with d147cc1a8140e9ecb78c5ba232b3f64a49cfe8af2ea8d933a79956bb621ed652 not found: ID does not exist" Dec 06 15:43:40 crc kubenswrapper[4848]: I1206 15:43:40.979262 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" path="/var/lib/kubelet/pods/247cc0d5-df71-4d57-bf35-bc008d40a441/volumes" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.330510 4848 generic.go:334] "Generic (PLEG): container finished" podID="ebaeee0b-28be-4402-8323-57dad52490a6" containerID="2f616b5ba29eaadd82e331746447191830704374d3309b9983bcd2bac904296f" exitCode=0 Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.330585 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerDied","Data":"2f616b5ba29eaadd82e331746447191830704374d3309b9983bcd2bac904296f"} Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.968477 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:41 crc kubenswrapper[4848]: E1206 15:43:41.969655 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="registry-server" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.969684 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="registry-server" Dec 06 15:43:41 crc kubenswrapper[4848]: E1206 15:43:41.969720 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="extract-content" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.969729 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="extract-content" Dec 06 15:43:41 crc kubenswrapper[4848]: E1206 15:43:41.969751 4848 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="extract-utilities" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.969760 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="extract-utilities" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.969913 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="247cc0d5-df71-4d57-bf35-bc008d40a441" containerName="registry-server" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.970863 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:41 crc kubenswrapper[4848]: I1206 15:43:41.977311 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.038208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6mn\" (UniqueName: \"kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.038358 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.038377 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities\") pod \"redhat-marketplace-l45vl\" (UID: 
\"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.139276 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.139323 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.139344 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6mn\" (UniqueName: \"kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.139776 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.141113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " 
pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.159164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6mn\" (UniqueName: \"kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn\") pod \"redhat-marketplace-l45vl\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.303688 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344466 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"e1bce23e7f4e1ebdf4b77f5b0b8075227d0d6e19193628bab6f6dd0ebef39551"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344521 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"a80eac01536f763d99f23b24a32f144f205aa7564421f7550225970ba2c23913"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344536 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"e5ce741cd3c4e52da8ff2eca7b9efa2efbf39db08669cdb6b18efd3aeeea2826"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344586 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"8a1b6230459575686c3457f1f96d68fb0d21db1a7d4c54a02d6502d6215c6388"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344602 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"5cf647e7c98fdeff597c2fe3b723c4fb55a34459543a6a3ce500b8dc632d1d91"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.344613 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pqzrf" event={"ID":"ebaeee0b-28be-4402-8323-57dad52490a6","Type":"ContainerStarted","Data":"3efd4cf416dd22d0efdcd89992691fb59400d533a8879804af32031d4507b0b0"} Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.345834 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.533292 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4b5lc" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.554841 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pqzrf" podStartSLOduration=4.87312143 podStartE2EDuration="12.554822147s" podCreationTimestamp="2025-12-06 15:43:30 +0000 UTC" firstStartedPulling="2025-12-06 15:43:31.059373921 +0000 UTC m=+878.357384834" lastFinishedPulling="2025-12-06 15:43:38.741074638 +0000 UTC m=+886.039085551" observedRunningTime="2025-12-06 15:43:42.417457722 +0000 UTC m=+889.715468635" watchObservedRunningTime="2025-12-06 15:43:42.554822147 +0000 UTC m=+889.852833060" Dec 06 15:43:42 crc kubenswrapper[4848]: I1206 15:43:42.747143 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:43 crc kubenswrapper[4848]: I1206 15:43:43.350469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerStarted","Data":"c5e2f071a70a070a4fa6163b31d155818e2d4e54b08df674a9c743c84feae6c0"} Dec 06 15:43:44 crc kubenswrapper[4848]: I1206 
15:43:44.356313 4848 generic.go:334] "Generic (PLEG): container finished" podID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerID="a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502" exitCode=0 Dec 06 15:43:44 crc kubenswrapper[4848]: I1206 15:43:44.356407 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerDied","Data":"a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502"} Dec 06 15:43:45 crc kubenswrapper[4848]: I1206 15:43:45.363759 4848 generic.go:334] "Generic (PLEG): container finished" podID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerID="560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195" exitCode=0 Dec 06 15:43:45 crc kubenswrapper[4848]: I1206 15:43:45.363799 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerDied","Data":"560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195"} Dec 06 15:43:45 crc kubenswrapper[4848]: I1206 15:43:45.935012 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:45 crc kubenswrapper[4848]: I1206 15:43:45.977712 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.371446 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerStarted","Data":"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f"} Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.391092 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l45vl" podStartSLOduration=3.825501818 
podStartE2EDuration="5.391069794s" podCreationTimestamp="2025-12-06 15:43:41 +0000 UTC" firstStartedPulling="2025-12-06 15:43:44.35751223 +0000 UTC m=+891.655523143" lastFinishedPulling="2025-12-06 15:43:45.923080206 +0000 UTC m=+893.221091119" observedRunningTime="2025-12-06 15:43:46.388832764 +0000 UTC m=+893.686843677" watchObservedRunningTime="2025-12-06 15:43:46.391069794 +0000 UTC m=+893.689080707" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.567196 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.568258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.588097 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.606807 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.606868 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwm7k\" (UniqueName: \"kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.606923 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.708086 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.708143 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwm7k\" (UniqueName: \"kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.708194 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.708640 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.708686 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.726570 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwm7k\" (UniqueName: \"kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k\") pod \"community-operators-2sxks\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:46 crc kubenswrapper[4848]: I1206 15:43:46.882600 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:47 crc kubenswrapper[4848]: I1206 15:43:47.150216 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:43:47 crc kubenswrapper[4848]: I1206 15:43:47.150578 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:43:47 crc kubenswrapper[4848]: I1206 15:43:47.320803 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:43:47 crc kubenswrapper[4848]: W1206 15:43:47.322379 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd43c1db_7add_4e59_ad28_95d13ff32f1a.slice/crio-faaab887f6b43ae1dfdbee55e2710123675dd59f6c0e48494460ca87bd664cb9 WatchSource:0}: Error finding container faaab887f6b43ae1dfdbee55e2710123675dd59f6c0e48494460ca87bd664cb9: Status 404 returned error can't find the container with id faaab887f6b43ae1dfdbee55e2710123675dd59f6c0e48494460ca87bd664cb9 Dec 06 15:43:47 crc kubenswrapper[4848]: I1206 15:43:47.381623 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerStarted","Data":"faaab887f6b43ae1dfdbee55e2710123675dd59f6c0e48494460ca87bd664cb9"} Dec 06 15:43:48 crc kubenswrapper[4848]: I1206 15:43:48.388909 4848 generic.go:334] "Generic (PLEG): container finished" podID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerID="28f6eb388d0b0d1ca658e9a0697b3428745e7641a393e42b78bd8217b1ff0e3c" exitCode=0 Dec 06 15:43:48 crc kubenswrapper[4848]: I1206 15:43:48.389013 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerDied","Data":"28f6eb388d0b0d1ca658e9a0697b3428745e7641a393e42b78bd8217b1ff0e3c"} Dec 06 15:43:49 crc kubenswrapper[4848]: I1206 15:43:49.395656 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerStarted","Data":"4118dbec2a5dcababf76f49080c58868c60ec19c93d34ee5a5c99c5cceb41308"} Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.402146 4848 generic.go:334] "Generic (PLEG): container finished" podID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerID="4118dbec2a5dcababf76f49080c58868c60ec19c93d34ee5a5c99c5cceb41308" exitCode=0 Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.402223 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerDied","Data":"4118dbec2a5dcababf76f49080c58868c60ec19c93d34ee5a5c99c5cceb41308"} Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.765063 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-62kg6"] Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.765808 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.768053 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.774194 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mh45j" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.775301 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-62kg6"] Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.780399 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.860250 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnzfv\" (UniqueName: \"kubernetes.io/projected/10365f81-1470-441b-9533-24f6a526fe55-kube-api-access-bnzfv\") pod \"openstack-operator-index-62kg6\" (UID: \"10365f81-1470-441b-9533-24f6a526fe55\") " pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.950202 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-ptrrq" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 
15:43:50.961264 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnzfv\" (UniqueName: \"kubernetes.io/projected/10365f81-1470-441b-9533-24f6a526fe55-kube-api-access-bnzfv\") pod \"openstack-operator-index-62kg6\" (UID: \"10365f81-1470-441b-9533-24f6a526fe55\") " pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:43:50 crc kubenswrapper[4848]: I1206 15:43:50.981674 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnzfv\" (UniqueName: \"kubernetes.io/projected/10365f81-1470-441b-9533-24f6a526fe55-kube-api-access-bnzfv\") pod \"openstack-operator-index-62kg6\" (UID: \"10365f81-1470-441b-9533-24f6a526fe55\") " pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:43:51 crc kubenswrapper[4848]: I1206 15:43:51.082625 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:43:51 crc kubenswrapper[4848]: I1206 15:43:51.409820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerStarted","Data":"1d0bf90ced86bfd02dac23fbd271de61ad418adbb4e0c1a9a7bcd9aac864de92"} Dec 06 15:43:51 crc kubenswrapper[4848]: I1206 15:43:51.422825 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2sxks" podStartSLOduration=2.751441819 podStartE2EDuration="5.422808235s" podCreationTimestamp="2025-12-06 15:43:46 +0000 UTC" firstStartedPulling="2025-12-06 15:43:48.390941886 +0000 UTC m=+895.688952799" lastFinishedPulling="2025-12-06 15:43:51.062308302 +0000 UTC m=+898.360319215" observedRunningTime="2025-12-06 15:43:51.422355093 +0000 UTC m=+898.720366006" watchObservedRunningTime="2025-12-06 15:43:51.422808235 +0000 UTC m=+898.720819148" Dec 06 15:43:51 crc kubenswrapper[4848]: I1206 15:43:51.501815 4848 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-62kg6"] Dec 06 15:43:51 crc kubenswrapper[4848]: W1206 15:43:51.515799 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10365f81_1470_441b_9533_24f6a526fe55.slice/crio-1699e5da49d01137d7f409e9dcea836b468458ead09421ba526f88fb41f4242f WatchSource:0}: Error finding container 1699e5da49d01137d7f409e9dcea836b468458ead09421ba526f88fb41f4242f: Status 404 returned error can't find the container with id 1699e5da49d01137d7f409e9dcea836b468458ead09421ba526f88fb41f4242f Dec 06 15:43:51 crc kubenswrapper[4848]: I1206 15:43:51.651813 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-9xn7s" Dec 06 15:43:52 crc kubenswrapper[4848]: I1206 15:43:52.312175 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:52 crc kubenswrapper[4848]: I1206 15:43:52.312219 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:52 crc kubenswrapper[4848]: I1206 15:43:52.363210 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:52 crc kubenswrapper[4848]: I1206 15:43:52.416102 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-62kg6" event={"ID":"10365f81-1470-441b-9533-24f6a526fe55","Type":"ContainerStarted","Data":"1699e5da49d01137d7f409e9dcea836b468458ead09421ba526f88fb41f4242f"} Dec 06 15:43:52 crc kubenswrapper[4848]: I1206 15:43:52.457001 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:54 crc kubenswrapper[4848]: I1206 15:43:54.428207 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-62kg6" event={"ID":"10365f81-1470-441b-9533-24f6a526fe55","Type":"ContainerStarted","Data":"edb63c34b0046e6d065f3db799a04039b5376b1ba42a2621a48811b25cf17a82"} Dec 06 15:43:54 crc kubenswrapper[4848]: I1206 15:43:54.442106 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-62kg6" podStartSLOduration=2.197828697 podStartE2EDuration="4.442089574s" podCreationTimestamp="2025-12-06 15:43:50 +0000 UTC" firstStartedPulling="2025-12-06 15:43:51.51705868 +0000 UTC m=+898.815069593" lastFinishedPulling="2025-12-06 15:43:53.761319557 +0000 UTC m=+901.059330470" observedRunningTime="2025-12-06 15:43:54.439686109 +0000 UTC m=+901.737697022" watchObservedRunningTime="2025-12-06 15:43:54.442089574 +0000 UTC m=+901.740100487" Dec 06 15:43:55 crc kubenswrapper[4848]: I1206 15:43:55.552820 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:55 crc kubenswrapper[4848]: I1206 15:43:55.553057 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l45vl" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="registry-server" containerID="cri-o://1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f" gracePeriod=2 Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.402045 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.440987 4848 generic.go:334] "Generic (PLEG): container finished" podID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerID="1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f" exitCode=0 Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.441033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerDied","Data":"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f"} Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.441061 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l45vl" event={"ID":"2363ddf7-2521-46fc-b2a3-240626ae9b6d","Type":"ContainerDied","Data":"c5e2f071a70a070a4fa6163b31d155818e2d4e54b08df674a9c743c84feae6c0"} Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.441080 4848 scope.go:117] "RemoveContainer" containerID="1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.441178 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l45vl" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.458278 4848 scope.go:117] "RemoveContainer" containerID="560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.474819 4848 scope.go:117] "RemoveContainer" containerID="a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.497057 4848 scope.go:117] "RemoveContainer" containerID="1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f" Dec 06 15:43:56 crc kubenswrapper[4848]: E1206 15:43:56.497526 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f\": container with ID starting with 1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f not found: ID does not exist" containerID="1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.497564 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f"} err="failed to get container status \"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f\": rpc error: code = NotFound desc = could not find container \"1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f\": container with ID starting with 1a69ae7e26eef54a5bd5aad63e43e860e275f80a3d2a5a37657d69411199460f not found: ID does not exist" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.497591 4848 scope.go:117] "RemoveContainer" containerID="560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195" Dec 06 15:43:56 crc kubenswrapper[4848]: E1206 15:43:56.498045 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195\": container with ID starting with 560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195 not found: ID does not exist" containerID="560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.498066 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195"} err="failed to get container status \"560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195\": rpc error: code = NotFound desc = could not find container \"560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195\": container with ID starting with 560c97fbaafacdb9f49e23e84f6298ef40adce3f3a96a882eb0dae0ab5b3a195 not found: ID does not exist" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.498082 4848 scope.go:117] "RemoveContainer" containerID="a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502" Dec 06 15:43:56 crc kubenswrapper[4848]: E1206 15:43:56.498321 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502\": container with ID starting with a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502 not found: ID does not exist" containerID="a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.498341 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502"} err="failed to get container status \"a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502\": rpc error: code = NotFound desc = could not find container 
\"a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502\": container with ID starting with a5d8df2292436eca591dffde6fc6fff5998c40e4fdbd9b0e6749568dedb00502 not found: ID does not exist" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.533377 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities\") pod \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.533441 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b6mn\" (UniqueName: \"kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn\") pod \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.533505 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content\") pod \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\" (UID: \"2363ddf7-2521-46fc-b2a3-240626ae9b6d\") " Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.537366 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities" (OuterVolumeSpecName: "utilities") pod "2363ddf7-2521-46fc-b2a3-240626ae9b6d" (UID: "2363ddf7-2521-46fc-b2a3-240626ae9b6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.539625 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn" (OuterVolumeSpecName: "kube-api-access-7b6mn") pod "2363ddf7-2521-46fc-b2a3-240626ae9b6d" (UID: "2363ddf7-2521-46fc-b2a3-240626ae9b6d"). InnerVolumeSpecName "kube-api-access-7b6mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.551440 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2363ddf7-2521-46fc-b2a3-240626ae9b6d" (UID: "2363ddf7-2521-46fc-b2a3-240626ae9b6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.634622 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.634667 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b6mn\" (UniqueName: \"kubernetes.io/projected/2363ddf7-2521-46fc-b2a3-240626ae9b6d-kube-api-access-7b6mn\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.634678 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363ddf7-2521-46fc-b2a3-240626ae9b6d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.763682 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 
15:43:56.767560 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l45vl"] Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.883366 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.883420 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.937860 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:56 crc kubenswrapper[4848]: I1206 15:43:56.975317 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" path="/var/lib/kubelet/pods/2363ddf7-2521-46fc-b2a3-240626ae9b6d/volumes" Dec 06 15:43:57 crc kubenswrapper[4848]: I1206 15:43:57.485766 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:43:58 crc kubenswrapper[4848]: I1206 15:43:58.158799 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:43:59 crc kubenswrapper[4848]: I1206 15:43:59.456176 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2sxks" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="registry-server" containerID="cri-o://1d0bf90ced86bfd02dac23fbd271de61ad418adbb4e0c1a9a7bcd9aac864de92" gracePeriod=2 Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.467243 4848 generic.go:334] "Generic (PLEG): container finished" podID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerID="1d0bf90ced86bfd02dac23fbd271de61ad418adbb4e0c1a9a7bcd9aac864de92" exitCode=0 Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.467739 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerDied","Data":"1d0bf90ced86bfd02dac23fbd271de61ad418adbb4e0c1a9a7bcd9aac864de92"} Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.521033 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.589024 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwm7k\" (UniqueName: \"kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k\") pod \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.589091 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities\") pod \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.589148 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content\") pod \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\" (UID: \"dd43c1db-7add-4e59-ad28-95d13ff32f1a\") " Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.590145 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities" (OuterVolumeSpecName: "utilities") pod "dd43c1db-7add-4e59-ad28-95d13ff32f1a" (UID: "dd43c1db-7add-4e59-ad28-95d13ff32f1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.594331 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k" (OuterVolumeSpecName: "kube-api-access-kwm7k") pod "dd43c1db-7add-4e59-ad28-95d13ff32f1a" (UID: "dd43c1db-7add-4e59-ad28-95d13ff32f1a"). InnerVolumeSpecName "kube-api-access-kwm7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.638359 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd43c1db-7add-4e59-ad28-95d13ff32f1a" (UID: "dd43c1db-7add-4e59-ad28-95d13ff32f1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.690359 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwm7k\" (UniqueName: \"kubernetes.io/projected/dd43c1db-7add-4e59-ad28-95d13ff32f1a-kube-api-access-kwm7k\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.690611 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.690671 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd43c1db-7add-4e59-ad28-95d13ff32f1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:00 crc kubenswrapper[4848]: I1206 15:44:00.936459 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pqzrf" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 
15:44:01.083308 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.083366 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.110842 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.477405 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2sxks" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.477964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2sxks" event={"ID":"dd43c1db-7add-4e59-ad28-95d13ff32f1a","Type":"ContainerDied","Data":"faaab887f6b43ae1dfdbee55e2710123675dd59f6c0e48494460ca87bd664cb9"} Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.478016 4848 scope.go:117] "RemoveContainer" containerID="1d0bf90ced86bfd02dac23fbd271de61ad418adbb4e0c1a9a7bcd9aac864de92" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.499273 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.499676 4848 scope.go:117] "RemoveContainer" containerID="4118dbec2a5dcababf76f49080c58868c60ec19c93d34ee5a5c99c5cceb41308" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.504752 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2sxks"] Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 15:44:01.507840 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-62kg6" Dec 06 15:44:01 crc kubenswrapper[4848]: I1206 
15:44:01.519615 4848 scope.go:117] "RemoveContainer" containerID="28f6eb388d0b0d1ca658e9a0697b3428745e7641a393e42b78bd8217b1ff0e3c" Dec 06 15:44:02 crc kubenswrapper[4848]: I1206 15:44:02.974424 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" path="/var/lib/kubelet/pods/dd43c1db-7add-4e59-ad28-95d13ff32f1a/volumes" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798375 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg"] Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798599 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="extract-utilities" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798611 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="extract-utilities" Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798622 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798628 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798644 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="extract-content" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798650 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="extract-content" Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798660 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="extract-utilities" Dec 06 
15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798666 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="extract-utilities" Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798674 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798679 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: E1206 15:44:04.798687 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="extract-content" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798715 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="extract-content" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798814 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2363ddf7-2521-46fc-b2a3-240626ae9b6d" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.798831 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd43c1db-7add-4e59-ad28-95d13ff32f1a" containerName="registry-server" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.799870 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.805002 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-58t5b" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.816652 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg"] Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.845718 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwxf\" (UniqueName: \"kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.845785 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.845841 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 
15:44:04.946632 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwxf\" (UniqueName: \"kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.946690 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.946747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.947208 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.947274 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:04 crc kubenswrapper[4848]: I1206 15:44:04.964795 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwxf\" (UniqueName: \"kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf\") pod \"aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:05 crc kubenswrapper[4848]: I1206 15:44:05.124934 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:05 crc kubenswrapper[4848]: I1206 15:44:05.513377 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg"] Dec 06 15:44:05 crc kubenswrapper[4848]: W1206 15:44:05.523214 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a2e359c_0d23_4b5f_a484_9d010361a7dd.slice/crio-ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268 WatchSource:0}: Error finding container ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268: Status 404 returned error can't find the container with id ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268 Dec 06 15:44:06 crc kubenswrapper[4848]: I1206 15:44:06.510727 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerID="2e4c0f1d222a622494ec1b6ec95b5dea67f25c8380c10f61d3f1c7e411da873c" exitCode=0 Dec 06 
15:44:06 crc kubenswrapper[4848]: I1206 15:44:06.510764 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" event={"ID":"0a2e359c-0d23-4b5f-a484-9d010361a7dd","Type":"ContainerDied","Data":"2e4c0f1d222a622494ec1b6ec95b5dea67f25c8380c10f61d3f1c7e411da873c"} Dec 06 15:44:06 crc kubenswrapper[4848]: I1206 15:44:06.510798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" event={"ID":"0a2e359c-0d23-4b5f-a484-9d010361a7dd","Type":"ContainerStarted","Data":"ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268"} Dec 06 15:44:07 crc kubenswrapper[4848]: I1206 15:44:07.516467 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerID="316e15c291c76bb83ee241aa6f5ed2f2fc387bf53a9d5994a644852f0096eaaa" exitCode=0 Dec 06 15:44:07 crc kubenswrapper[4848]: I1206 15:44:07.516758 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" event={"ID":"0a2e359c-0d23-4b5f-a484-9d010361a7dd","Type":"ContainerDied","Data":"316e15c291c76bb83ee241aa6f5ed2f2fc387bf53a9d5994a644852f0096eaaa"} Dec 06 15:44:08 crc kubenswrapper[4848]: I1206 15:44:08.526655 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerID="5185a323fd8b5f39c6138214be7900955107070d98187527ab180490cb6bea9c" exitCode=0 Dec 06 15:44:08 crc kubenswrapper[4848]: I1206 15:44:08.526692 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" event={"ID":"0a2e359c-0d23-4b5f-a484-9d010361a7dd","Type":"ContainerDied","Data":"5185a323fd8b5f39c6138214be7900955107070d98187527ab180490cb6bea9c"} Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.782855 
4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.832049 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util\") pod \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.832103 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwxf\" (UniqueName: \"kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf\") pod \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.832207 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle\") pod \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\" (UID: \"0a2e359c-0d23-4b5f-a484-9d010361a7dd\") " Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.833272 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle" (OuterVolumeSpecName: "bundle") pod "0a2e359c-0d23-4b5f-a484-9d010361a7dd" (UID: "0a2e359c-0d23-4b5f-a484-9d010361a7dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.854904 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util" (OuterVolumeSpecName: "util") pod "0a2e359c-0d23-4b5f-a484-9d010361a7dd" (UID: "0a2e359c-0d23-4b5f-a484-9d010361a7dd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.864283 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf" (OuterVolumeSpecName: "kube-api-access-tdwxf") pod "0a2e359c-0d23-4b5f-a484-9d010361a7dd" (UID: "0a2e359c-0d23-4b5f-a484-9d010361a7dd"). InnerVolumeSpecName "kube-api-access-tdwxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.933650 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.933679 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a2e359c-0d23-4b5f-a484-9d010361a7dd-util\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:09 crc kubenswrapper[4848]: I1206 15:44:09.933688 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwxf\" (UniqueName: \"kubernetes.io/projected/0a2e359c-0d23-4b5f-a484-9d010361a7dd-kube-api-access-tdwxf\") on node \"crc\" DevicePath \"\"" Dec 06 15:44:10 crc kubenswrapper[4848]: I1206 15:44:10.545239 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" event={"ID":"0a2e359c-0d23-4b5f-a484-9d010361a7dd","Type":"ContainerDied","Data":"ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268"} Dec 06 15:44:10 crc kubenswrapper[4848]: I1206 15:44:10.545275 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9b6cad76c7262901cbede90bb522a391ca95e3e79807aa607a765e5aca5268" Dec 06 15:44:10 crc kubenswrapper[4848]: I1206 15:44:10.545346 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.950585 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4"] Dec 06 15:44:13 crc kubenswrapper[4848]: E1206 15:44:13.951496 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="pull" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.951513 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="pull" Dec 06 15:44:13 crc kubenswrapper[4848]: E1206 15:44:13.951527 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="util" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.951537 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="util" Dec 06 15:44:13 crc kubenswrapper[4848]: E1206 15:44:13.951555 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="extract" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.951563 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="extract" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.951723 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2e359c-0d23-4b5f-a484-9d010361a7dd" containerName="extract" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.952223 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.955064 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-t7s7g" Dec 06 15:44:13 crc kubenswrapper[4848]: I1206 15:44:13.973135 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4"] Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.064001 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m96nc\" (UniqueName: \"kubernetes.io/projected/e10571bc-0d25-47b1-bcd1-272c89db1fb6-kube-api-access-m96nc\") pod \"openstack-operator-controller-operator-865d7c46f-s58q4\" (UID: \"e10571bc-0d25-47b1-bcd1-272c89db1fb6\") " pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.164870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m96nc\" (UniqueName: \"kubernetes.io/projected/e10571bc-0d25-47b1-bcd1-272c89db1fb6-kube-api-access-m96nc\") pod \"openstack-operator-controller-operator-865d7c46f-s58q4\" (UID: \"e10571bc-0d25-47b1-bcd1-272c89db1fb6\") " pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.184806 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m96nc\" (UniqueName: \"kubernetes.io/projected/e10571bc-0d25-47b1-bcd1-272c89db1fb6-kube-api-access-m96nc\") pod \"openstack-operator-controller-operator-865d7c46f-s58q4\" (UID: \"e10571bc-0d25-47b1-bcd1-272c89db1fb6\") " pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.275005 4848 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.639502 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4"] Dec 06 15:44:14 crc kubenswrapper[4848]: I1206 15:44:14.674688 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" event={"ID":"e10571bc-0d25-47b1-bcd1-272c89db1fb6","Type":"ContainerStarted","Data":"831c2bcebd7d3090d301d02fd6d56835042f9427c81b9ed0ab81669bad5da4f5"} Dec 06 15:44:17 crc kubenswrapper[4848]: I1206 15:44:17.150239 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:44:17 crc kubenswrapper[4848]: I1206 15:44:17.150830 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:44:18 crc kubenswrapper[4848]: I1206 15:44:18.707833 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" event={"ID":"e10571bc-0d25-47b1-bcd1-272c89db1fb6","Type":"ContainerStarted","Data":"9de4bc3ce830ec0fb965eda3acd4949d774324c924d56caaf778704034fc8e75"} Dec 06 15:44:18 crc kubenswrapper[4848]: I1206 15:44:18.708213 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:18 crc kubenswrapper[4848]: I1206 15:44:18.735545 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" podStartSLOduration=2.2795020040000002 podStartE2EDuration="5.73552396s" podCreationTimestamp="2025-12-06 15:44:13 +0000 UTC" firstStartedPulling="2025-12-06 15:44:14.634449607 +0000 UTC m=+921.932460520" lastFinishedPulling="2025-12-06 15:44:18.090471563 +0000 UTC m=+925.388482476" observedRunningTime="2025-12-06 15:44:18.730386552 +0000 UTC m=+926.028397465" watchObservedRunningTime="2025-12-06 15:44:18.73552396 +0000 UTC m=+926.033534873" Dec 06 15:44:24 crc kubenswrapper[4848]: I1206 15:44:24.278879 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-865d7c46f-s58q4" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.922614 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9"] Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.926555 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.932908 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6cg4n" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.944235 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhzc\" (UniqueName: \"kubernetes.io/projected/706ced85-8889-45e9-bd15-1a2747a9de2e-kube-api-access-wnhzc\") pod \"barbican-operator-controller-manager-7d9dfd778-69cx9\" (UID: \"706ced85-8889-45e9-bd15-1a2747a9de2e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.949859 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9"] Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.963754 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm"] Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.965932 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.983617 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5"] Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.984489 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k2c77" Dec 06 15:44:43 crc kubenswrapper[4848]: I1206 15:44:43.992362 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.004734 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.008432 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.012957 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7cpnl" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.014088 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-m4jn8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.034464 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.046137 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzg46\" (UniqueName: \"kubernetes.io/projected/1438e750-61e2-4c37-8d03-f22b8ebad123-kube-api-access-vzg46\") pod \"designate-operator-controller-manager-697fb699cf-rhsqm\" (UID: \"1438e750-61e2-4c37-8d03-f22b8ebad123\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.046250 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzhz\" (UniqueName: \"kubernetes.io/projected/d99ce981-7f71-4636-94c7-c848830429f3-kube-api-access-dnzhz\") pod \"glance-operator-controller-manager-5697bb5779-dzsgv\" (UID: 
\"d99ce981-7f71-4636-94c7-c848830429f3\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.046320 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vqt4\" (UniqueName: \"kubernetes.io/projected/9418eec1-6430-4bb9-a7be-6ec83f61c629-kube-api-access-7vqt4\") pod \"cinder-operator-controller-manager-6c677c69b-plnw5\" (UID: \"9418eec1-6430-4bb9-a7be-6ec83f61c629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.046392 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhzc\" (UniqueName: \"kubernetes.io/projected/706ced85-8889-45e9-bd15-1a2747a9de2e-kube-api-access-wnhzc\") pod \"barbican-operator-controller-manager-7d9dfd778-69cx9\" (UID: \"706ced85-8889-45e9-bd15-1a2747a9de2e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.048434 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.065202 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.067016 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.076253 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ltmzs" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.078113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhzc\" (UniqueName: \"kubernetes.io/projected/706ced85-8889-45e9-bd15-1a2747a9de2e-kube-api-access-wnhzc\") pod \"barbican-operator-controller-manager-7d9dfd778-69cx9\" (UID: \"706ced85-8889-45e9-bd15-1a2747a9de2e\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.096318 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.108448 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.124752 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.125911 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.133282 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tzbt6" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.147294 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzg46\" (UniqueName: \"kubernetes.io/projected/1438e750-61e2-4c37-8d03-f22b8ebad123-kube-api-access-vzg46\") pod \"designate-operator-controller-manager-697fb699cf-rhsqm\" (UID: \"1438e750-61e2-4c37-8d03-f22b8ebad123\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.147589 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tsxn\" (UniqueName: \"kubernetes.io/projected/fd64b532-b259-49e5-bd47-62d9d20b6a69-kube-api-access-9tsxn\") pod \"heat-operator-controller-manager-5f64f6f8bb-6vq9l\" (UID: \"fd64b532-b259-49e5-bd47-62d9d20b6a69\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.147758 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w5w9\" (UniqueName: \"kubernetes.io/projected/1b49aafe-c450-441c-ade7-0d87b868dc2a-kube-api-access-9w5w9\") pod \"horizon-operator-controller-manager-68c6d99b8f-jtn8k\" (UID: \"1b49aafe-c450-441c-ade7-0d87b868dc2a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.147875 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzhz\" (UniqueName: \"kubernetes.io/projected/d99ce981-7f71-4636-94c7-c848830429f3-kube-api-access-dnzhz\") 
pod \"glance-operator-controller-manager-5697bb5779-dzsgv\" (UID: \"d99ce981-7f71-4636-94c7-c848830429f3\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.148094 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqt4\" (UniqueName: \"kubernetes.io/projected/9418eec1-6430-4bb9-a7be-6ec83f61c629-kube-api-access-7vqt4\") pod \"cinder-operator-controller-manager-6c677c69b-plnw5\" (UID: \"9418eec1-6430-4bb9-a7be-6ec83f61c629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.164551 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.165846 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.168834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9q2zl" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.169311 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.194565 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.202565 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzg46\" (UniqueName: \"kubernetes.io/projected/1438e750-61e2-4c37-8d03-f22b8ebad123-kube-api-access-vzg46\") pod \"designate-operator-controller-manager-697fb699cf-rhsqm\" (UID: 
\"1438e750-61e2-4c37-8d03-f22b8ebad123\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.206948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzhz\" (UniqueName: \"kubernetes.io/projected/d99ce981-7f71-4636-94c7-c848830429f3-kube-api-access-dnzhz\") pod \"glance-operator-controller-manager-5697bb5779-dzsgv\" (UID: \"d99ce981-7f71-4636-94c7-c848830429f3\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.222519 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.225308 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vqt4\" (UniqueName: \"kubernetes.io/projected/9418eec1-6430-4bb9-a7be-6ec83f61c629-kube-api-access-7vqt4\") pod \"cinder-operator-controller-manager-6c677c69b-plnw5\" (UID: \"9418eec1-6430-4bb9-a7be-6ec83f61c629\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.243904 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.245158 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.254237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxd6\" (UniqueName: \"kubernetes.io/projected/cf4f4d25-7fc2-411f-9e23-71171162f38a-kube-api-access-cdxd6\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.254355 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tsxn\" (UniqueName: \"kubernetes.io/projected/fd64b532-b259-49e5-bd47-62d9d20b6a69-kube-api-access-9tsxn\") pod \"heat-operator-controller-manager-5f64f6f8bb-6vq9l\" (UID: \"fd64b532-b259-49e5-bd47-62d9d20b6a69\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.254399 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w5w9\" (UniqueName: \"kubernetes.io/projected/1b49aafe-c450-441c-ade7-0d87b868dc2a-kube-api-access-9w5w9\") pod \"horizon-operator-controller-manager-68c6d99b8f-jtn8k\" (UID: \"1b49aafe-c450-441c-ade7-0d87b868dc2a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.254430 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.259578 
4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7fjl2" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.267355 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.270745 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.298008 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.309659 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w5w9\" (UniqueName: \"kubernetes.io/projected/1b49aafe-c450-441c-ade7-0d87b868dc2a-kube-api-access-9w5w9\") pod \"horizon-operator-controller-manager-68c6d99b8f-jtn8k\" (UID: \"1b49aafe-c450-441c-ade7-0d87b868dc2a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.309759 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.310957 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.319621 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s2dmg" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.337688 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.349918 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.356845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxd6\" (UniqueName: \"kubernetes.io/projected/cf4f4d25-7fc2-411f-9e23-71171162f38a-kube-api-access-cdxd6\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.357002 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vrnj\" (UniqueName: \"kubernetes.io/projected/fee1c62f-115d-472d-8617-32a386cf06c2-kube-api-access-5vrnj\") pod \"keystone-operator-controller-manager-7765d96ddf-4gqwc\" (UID: \"fee1c62f-115d-472d-8617-32a386cf06c2\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.357052 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.357080 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq4kc\" (UniqueName: \"kubernetes.io/projected/eecbdda5-b888-4fb9-979d-66bb4d8ffcf4-kube-api-access-rq4kc\") pod 
\"ironic-operator-controller-manager-54476ccddc-74npj\" (UID: \"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4\") " pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.358143 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.358194 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:44:44.858174848 +0000 UTC m=+952.156185761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.359757 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.360747 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.372401 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tsxn\" (UniqueName: \"kubernetes.io/projected/fd64b532-b259-49e5-bd47-62d9d20b6a69-kube-api-access-9tsxn\") pod \"heat-operator-controller-manager-5f64f6f8bb-6vq9l\" (UID: \"fd64b532-b259-49e5-bd47-62d9d20b6a69\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.373662 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hnnqv" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.402315 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxd6\" (UniqueName: \"kubernetes.io/projected/cf4f4d25-7fc2-411f-9e23-71171162f38a-kube-api-access-cdxd6\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.412418 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.423616 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.430832 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.432040 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.441091 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.448245 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.450770 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.460480 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshqx\" (UniqueName: \"kubernetes.io/projected/76bfd2a6-6774-4992-91ec-c73327b11bd8-kube-api-access-qshqx\") pod \"manila-operator-controller-manager-5b5fd79c9c-4vwqn\" (UID: \"76bfd2a6-6774-4992-91ec-c73327b11bd8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.460540 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vrnj\" (UniqueName: \"kubernetes.io/projected/fee1c62f-115d-472d-8617-32a386cf06c2-kube-api-access-5vrnj\") pod \"keystone-operator-controller-manager-7765d96ddf-4gqwc\" (UID: \"fee1c62f-115d-472d-8617-32a386cf06c2\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.460574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq4kc\" (UniqueName: \"kubernetes.io/projected/eecbdda5-b888-4fb9-979d-66bb4d8ffcf4-kube-api-access-rq4kc\") pod \"ironic-operator-controller-manager-54476ccddc-74npj\" (UID: 
\"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4\") " pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.460595 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dl8d\" (UniqueName: \"kubernetes.io/projected/95e14baa-5bbb-4bf6-9420-0428c25cc98f-kube-api-access-7dl8d\") pod \"mariadb-operator-controller-manager-79c8c4686c-9vx7t\" (UID: \"95e14baa-5bbb-4bf6-9420-0428c25cc98f\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.467763 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.469041 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.474908 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xmqd6" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.479111 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-q48p8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.481806 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.483959 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.499473 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.540064 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-js7m7" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.555974 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.559321 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.559428 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.569731 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jrrqp" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.571248 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq4kc\" (UniqueName: \"kubernetes.io/projected/eecbdda5-b888-4fb9-979d-66bb4d8ffcf4-kube-api-access-rq4kc\") pod \"ironic-operator-controller-manager-54476ccddc-74npj\" (UID: \"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4\") " pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.578130 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949b5\" (UniqueName: \"kubernetes.io/projected/7c78ca28-a1c2-45b9-9a14-733aae9ee555-kube-api-access-949b5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vpn6w\" (UID: \"7c78ca28-a1c2-45b9-9a14-733aae9ee555\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.578191 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qshqx\" (UniqueName: \"kubernetes.io/projected/76bfd2a6-6774-4992-91ec-c73327b11bd8-kube-api-access-qshqx\") pod \"manila-operator-controller-manager-5b5fd79c9c-4vwqn\" (UID: \"76bfd2a6-6774-4992-91ec-c73327b11bd8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.578255 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dl8d\" (UniqueName: \"kubernetes.io/projected/95e14baa-5bbb-4bf6-9420-0428c25cc98f-kube-api-access-7dl8d\") pod 
\"mariadb-operator-controller-manager-79c8c4686c-9vx7t\" (UID: \"95e14baa-5bbb-4bf6-9420-0428c25cc98f\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.578321 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc7jw\" (UniqueName: \"kubernetes.io/projected/fa68cc1e-18ec-42d1-a3de-948ef2cc0804-kube-api-access-fc7jw\") pod \"nova-operator-controller-manager-697bc559fc-s6hbc\" (UID: \"fa68cc1e-18ec-42d1-a3de-948ef2cc0804\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.616689 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.618910 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vrnj\" (UniqueName: \"kubernetes.io/projected/fee1c62f-115d-472d-8617-32a386cf06c2-kube-api-access-5vrnj\") pod \"keystone-operator-controller-manager-7765d96ddf-4gqwc\" (UID: \"fee1c62f-115d-472d-8617-32a386cf06c2\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.649169 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.666820 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.667746 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshqx\" (UniqueName: \"kubernetes.io/projected/76bfd2a6-6774-4992-91ec-c73327b11bd8-kube-api-access-qshqx\") pod \"manila-operator-controller-manager-5b5fd79c9c-4vwqn\" (UID: \"76bfd2a6-6774-4992-91ec-c73327b11bd8\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.694464 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dl8d\" (UniqueName: \"kubernetes.io/projected/95e14baa-5bbb-4bf6-9420-0428c25cc98f-kube-api-access-7dl8d\") pod \"mariadb-operator-controller-manager-79c8c4686c-9vx7t\" (UID: \"95e14baa-5bbb-4bf6-9420-0428c25cc98f\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.694574 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.698653 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.700336 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-t8w9l" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.706238 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.707633 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.713531 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.716426 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-949b5\" (UniqueName: \"kubernetes.io/projected/7c78ca28-a1c2-45b9-9a14-733aae9ee555-kube-api-access-949b5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vpn6w\" (UID: \"7c78ca28-a1c2-45b9-9a14-733aae9ee555\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.716481 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgs46\" (UniqueName: \"kubernetes.io/projected/dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd-kube-api-access-kgs46\") pod \"ovn-operator-controller-manager-b6456fdb6-x7d2b\" (UID: \"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.716534 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drr8z\" (UniqueName: \"kubernetes.io/projected/0d42fecd-1b6d-4f29-816c-38515c0a547c-kube-api-access-drr8z\") pod \"octavia-operator-controller-manager-998648c74-9zcn8\" (UID: \"0d42fecd-1b6d-4f29-816c-38515c0a547c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.716569 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7jw\" (UniqueName: \"kubernetes.io/projected/fa68cc1e-18ec-42d1-a3de-948ef2cc0804-kube-api-access-fc7jw\") pod \"nova-operator-controller-manager-697bc559fc-s6hbc\" (UID: \"fa68cc1e-18ec-42d1-a3de-948ef2cc0804\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.720215 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vj9hq" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.720654 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.741240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc7jw\" (UniqueName: \"kubernetes.io/projected/fa68cc1e-18ec-42d1-a3de-948ef2cc0804-kube-api-access-fc7jw\") pod \"nova-operator-controller-manager-697bc559fc-s6hbc\" (UID: \"fa68cc1e-18ec-42d1-a3de-948ef2cc0804\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.743885 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-949b5\" (UniqueName: \"kubernetes.io/projected/7c78ca28-a1c2-45b9-9a14-733aae9ee555-kube-api-access-949b5\") pod 
\"neutron-operator-controller-manager-5fdfd5b6b5-vpn6w\" (UID: \"7c78ca28-a1c2-45b9-9a14-733aae9ee555\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.761458 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.776558 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.781522 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.785072 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bhjwt" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.805400 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.817769 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgs46\" (UniqueName: \"kubernetes.io/projected/dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd-kube-api-access-kgs46\") pod \"ovn-operator-controller-manager-b6456fdb6-x7d2b\" (UID: \"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.817818 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: 
\"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.817858 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drr8z\" (UniqueName: \"kubernetes.io/projected/0d42fecd-1b6d-4f29-816c-38515c0a547c-kube-api-access-drr8z\") pod \"octavia-operator-controller-manager-998648c74-9zcn8\" (UID: \"0d42fecd-1b6d-4f29-816c-38515c0a547c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.817889 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7lp\" (UniqueName: \"kubernetes.io/projected/90274bce-5d78-4f62-8265-2218bc58916d-kube-api-access-4g7lp\") pod \"placement-operator-controller-manager-78f8948974-bxhmf\" (UID: \"90274bce-5d78-4f62-8265-2218bc58916d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.817919 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8f2\" (UniqueName: \"kubernetes.io/projected/ec6dd72c-05cb-49f7-af2a-01a76807175c-kube-api-access-fx8f2\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.818345 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.840186 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drr8z\" (UniqueName: 
\"kubernetes.io/projected/0d42fecd-1b6d-4f29-816c-38515c0a547c-kube-api-access-drr8z\") pod \"octavia-operator-controller-manager-998648c74-9zcn8\" (UID: \"0d42fecd-1b6d-4f29-816c-38515c0a547c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.848334 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.849919 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.850619 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.854208 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mgjw6" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.854773 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgs46\" (UniqueName: \"kubernetes.io/projected/dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd-kube-api-access-kgs46\") pod \"ovn-operator-controller-manager-b6456fdb6-x7d2b\" (UID: \"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.865451 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.873821 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 
15:44:44.875027 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.882871 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6bp7w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.889977 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.893064 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.895070 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.898032 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8zrkq" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.901586 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.903086 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.906814 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-krb58" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.914798 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919005 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dlt7\" (UniqueName: \"kubernetes.io/projected/f59ce483-133a-495a-862c-3676661adab8-kube-api-access-5dlt7\") pod \"test-operator-controller-manager-5854674fcc-x7tzd\" (UID: \"f59ce483-133a-495a-862c-3676661adab8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919072 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919111 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919168 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4g7lp\" (UniqueName: \"kubernetes.io/projected/90274bce-5d78-4f62-8265-2218bc58916d-kube-api-access-4g7lp\") pod \"placement-operator-controller-manager-78f8948974-bxhmf\" (UID: \"90274bce-5d78-4f62-8265-2218bc58916d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919221 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8f2\" (UniqueName: \"kubernetes.io/projected/ec6dd72c-05cb-49f7-af2a-01a76807175c-kube-api-access-fx8f2\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919277 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwzm\" (UniqueName: \"kubernetes.io/projected/cf3406a1-eb72-4251-bbe3-45a33235ac96-kube-api-access-njwzm\") pod \"telemetry-operator-controller-manager-58d5ff84df-cwth7\" (UID: \"cf3406a1-eb72-4251-bbe3-45a33235ac96\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.919322 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kdv\" (UniqueName: \"kubernetes.io/projected/d2b8bc9b-a359-4fe9-a039-200eed4f7218-kube-api-access-c2kdv\") pod \"swift-operator-controller-manager-9d58d64bc-5h8p9\" (UID: \"d2b8bc9b-a359-4fe9-a039-200eed4f7218\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.919355 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.919421 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:44:45.419402325 +0000 UTC m=+952.717413238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.920572 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:44 crc kubenswrapper[4848]: E1206 15:44:44.920638 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:44:45.920617578 +0000 UTC m=+953.218628571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.938953 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.939152 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd"] Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.939318 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.949745 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8f2\" (UniqueName: \"kubernetes.io/projected/ec6dd72c-05cb-49f7-af2a-01a76807175c-kube-api-access-fx8f2\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.950637 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7lp\" (UniqueName: \"kubernetes.io/projected/90274bce-5d78-4f62-8265-2218bc58916d-kube-api-access-4g7lp\") pod \"placement-operator-controller-manager-78f8948974-bxhmf\" (UID: \"90274bce-5d78-4f62-8265-2218bc58916d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:44:44 crc kubenswrapper[4848]: I1206 15:44:44.966102 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.025749 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwzm\" (UniqueName: \"kubernetes.io/projected/cf3406a1-eb72-4251-bbe3-45a33235ac96-kube-api-access-njwzm\") pod \"telemetry-operator-controller-manager-58d5ff84df-cwth7\" (UID: \"cf3406a1-eb72-4251-bbe3-45a33235ac96\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.025803 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kdv\" (UniqueName: \"kubernetes.io/projected/d2b8bc9b-a359-4fe9-a039-200eed4f7218-kube-api-access-c2kdv\") pod \"swift-operator-controller-manager-9d58d64bc-5h8p9\" (UID: \"d2b8bc9b-a359-4fe9-a039-200eed4f7218\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.025836 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dlt7\" (UniqueName: \"kubernetes.io/projected/f59ce483-133a-495a-862c-3676661adab8-kube-api-access-5dlt7\") pod \"test-operator-controller-manager-5854674fcc-x7tzd\" (UID: \"f59ce483-133a-495a-862c-3676661adab8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.025926 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgch\" (UniqueName: \"kubernetes.io/projected/93dc2914-4eae-4bd4-a4c3-94122e44f908-kube-api-access-2sgch\") pod \"watcher-operator-controller-manager-667bd8d554-s6ttq\" (UID: \"93dc2914-4eae-4bd4-a4c3-94122e44f908\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:44:45 crc kubenswrapper[4848]: 
I1206 15:44:45.029260 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.033425 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.037955 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.038157 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.041184 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jdc9c" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.044972 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.046325 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.053691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dlt7\" (UniqueName: \"kubernetes.io/projected/f59ce483-133a-495a-862c-3676661adab8-kube-api-access-5dlt7\") pod \"test-operator-controller-manager-5854674fcc-x7tzd\" (UID: \"f59ce483-133a-495a-862c-3676661adab8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.066942 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.067900 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.072117 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwzm\" (UniqueName: \"kubernetes.io/projected/cf3406a1-eb72-4251-bbe3-45a33235ac96-kube-api-access-njwzm\") pod \"telemetry-operator-controller-manager-58d5ff84df-cwth7\" (UID: \"cf3406a1-eb72-4251-bbe3-45a33235ac96\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.075505 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wrl7l" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.077904 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.078991 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kdv\" (UniqueName: 
\"kubernetes.io/projected/d2b8bc9b-a359-4fe9-a039-200eed4f7218-kube-api-access-c2kdv\") pod \"swift-operator-controller-manager-9d58d64bc-5h8p9\" (UID: \"d2b8bc9b-a359-4fe9-a039-200eed4f7218\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.114202 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.139344 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrzg\" (UniqueName: \"kubernetes.io/projected/deacbe3a-30ba-42bb-a180-f8e2360ba937-kube-api-access-pvrzg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pzg55\" (UID: \"deacbe3a-30ba-42bb-a180-f8e2360ba937\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.139410 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.139541 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.139636 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8j5\" (UniqueName: \"kubernetes.io/projected/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-kube-api-access-4c8j5\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.139670 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgch\" (UniqueName: \"kubernetes.io/projected/93dc2914-4eae-4bd4-a4c3-94122e44f908-kube-api-access-2sgch\") pod \"watcher-operator-controller-manager-667bd8d554-s6ttq\" (UID: \"93dc2914-4eae-4bd4-a4c3-94122e44f908\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.170303 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.177428 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgch\" (UniqueName: \"kubernetes.io/projected/93dc2914-4eae-4bd4-a4c3-94122e44f908-kube-api-access-2sgch\") pod \"watcher-operator-controller-manager-667bd8d554-s6ttq\" (UID: \"93dc2914-4eae-4bd4-a4c3-94122e44f908\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.245908 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 
15:44:45.246241 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.246323 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8j5\" (UniqueName: \"kubernetes.io/projected/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-kube-api-access-4c8j5\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.246384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrzg\" (UniqueName: \"kubernetes.io/projected/deacbe3a-30ba-42bb-a180-f8e2360ba937-kube-api-access-pvrzg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pzg55\" (UID: \"deacbe3a-30ba-42bb-a180-f8e2360ba937\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.246898 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.246955 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:45.7469369 +0000 UTC m=+953.044947813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.247131 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.247164 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:45.747154306 +0000 UTC m=+953.045165219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.248499 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.267058 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8j5\" (UniqueName: \"kubernetes.io/projected/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-kube-api-access-4c8j5\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.269271 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrzg\" (UniqueName: \"kubernetes.io/projected/deacbe3a-30ba-42bb-a180-f8e2360ba937-kube-api-access-pvrzg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pzg55\" (UID: \"deacbe3a-30ba-42bb-a180-f8e2360ba937\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.290002 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.315051 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.353652 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.397496 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.449133 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.449743 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.449820 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:44:46.449799547 +0000 UTC m=+953.747810530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.568489 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.579907 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.736447 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.744795 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.758484 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.758622 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" 
Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.758798 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.758852 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:46.758828231 +0000 UTC m=+954.056839144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.758896 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.758917 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:46.758910823 +0000 UTC m=+954.056921736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.769812 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.775653 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.779785 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.784787 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.890253 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" event={"ID":"d99ce981-7f71-4636-94c7-c848830429f3","Type":"ContainerStarted","Data":"23619df703feda6bd2d4cac887ab194b23135e8c75d206940a2e8b75c61e465c"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.893324 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" event={"ID":"76bfd2a6-6774-4992-91ec-c73327b11bd8","Type":"ContainerStarted","Data":"a260ec5e380262d4594c0cfa1ed18d50ad3e30b0c669be74bda83dacce54d246"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.902899 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" 
event={"ID":"fee1c62f-115d-472d-8617-32a386cf06c2","Type":"ContainerStarted","Data":"636ab128d909b94101e3ae1ecb4f6a4b165d6bba1b14cec4eeea65d1a88ed2c4"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.905380 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.907410 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" event={"ID":"1438e750-61e2-4c37-8d03-f22b8ebad123","Type":"ContainerStarted","Data":"188a8ee38f56d83cee010a17850fd7d879e3ee206f0ce1359e31b07cd553cd6c"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.913316 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.914008 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" event={"ID":"1b49aafe-c450-441c-ade7-0d87b868dc2a","Type":"ContainerStarted","Data":"69a64bf3e0c198270ea62423487cd95b20fb8fe1da7c7fc85c406437c4ef75f7"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.921921 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.922992 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" event={"ID":"9418eec1-6430-4bb9-a7be-6ec83f61c629","Type":"ContainerStarted","Data":"f111b3e9c399ba4e64670b481ed7474468d2dceb81c673262cdbb7f2c37d1454"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.924612 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" 
event={"ID":"fd64b532-b259-49e5-bd47-62d9d20b6a69","Type":"ContainerStarted","Data":"62aba9cc907f84fc26aeb588abc8d4603849fd3747402c2da893ec9c0c925b1f"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.925876 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" event={"ID":"706ced85-8889-45e9-bd15-1a2747a9de2e","Type":"ContainerStarted","Data":"3fc197d353cb34133ce8aa0eeae130840feaf666428168e24bf84ea097b1327d"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.926631 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" event={"ID":"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4","Type":"ContainerStarted","Data":"16a5a9458c9a10f7d58a71b30540130c54a97a657ba5197c609bb8dd0a6bdcc7"} Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.927894 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.932043 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.935945 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t"] Dec 06 15:44:45 crc kubenswrapper[4848]: I1206 15:44:45.961778 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.961934 4848 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:45 crc kubenswrapper[4848]: E1206 15:44:45.962011 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:44:47.961988366 +0000 UTC m=+955.259999279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.099001 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd"] Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.105423 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7"] Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.111808 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55"] Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.122458 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9"] Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.127767 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq"] Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.468829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod 
\"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.468973 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.469192 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:44:48.469176548 +0000 UTC m=+955.767187461 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.791229 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.791668 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " 
pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.791802 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.791853 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:48.791839231 +0000 UTC m=+956.089850144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.791867 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.791957 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:48.791929783 +0000 UTC m=+956.089940786 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "webhook-server-cert" not found Dec 06 15:44:46 crc kubenswrapper[4848]: W1206 15:44:46.850445 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3406a1_eb72_4251_bbe3_45a33235ac96.slice/crio-3eb43df06bb8644e8cee42f97199b49f3786c82d6b32ed70e234234a300f8003 WatchSource:0}: Error finding container 3eb43df06bb8644e8cee42f97199b49f3786c82d6b32ed70e234234a300f8003: Status 404 returned error can't find the container with id 3eb43df06bb8644e8cee42f97199b49f3786c82d6b32ed70e234234a300f8003 Dec 06 15:44:46 crc kubenswrapper[4848]: W1206 15:44:46.867437 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeacbe3a_30ba_42bb_a180_f8e2360ba937.slice/crio-aecbd80f03dae83110aff0ed78c22abcb4fe2984531d9862a5eaca12d539a504 WatchSource:0}: Error finding container aecbd80f03dae83110aff0ed78c22abcb4fe2984531d9862a5eaca12d539a504: Status 404 returned error can't find the container with id aecbd80f03dae83110aff0ed78c22abcb4fe2984531d9862a5eaca12d539a504 Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.874277 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c2kdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-5h8p9_openstack-operators(d2b8bc9b-a359-4fe9-a039-200eed4f7218): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:46 crc kubenswrapper[4848]: W1206 15:44:46.876377 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93dc2914_4eae_4bd4_a4c3_94122e44f908.slice/crio-62ec62d49a163a74a2866c290f4a116f6e8b57d215de5cd2df2798231af27ba3 WatchSource:0}: Error finding container 62ec62d49a163a74a2866c290f4a116f6e8b57d215de5cd2df2798231af27ba3: Status 404 returned error can't find the container with id 62ec62d49a163a74a2866c290f4a116f6e8b57d215de5cd2df2798231af27ba3 Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.882885 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c2kdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-5h8p9_openstack-operators(d2b8bc9b-a359-4fe9-a039-200eed4f7218): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.884242 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" podUID="d2b8bc9b-a359-4fe9-a039-200eed4f7218" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.885093 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sgch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-s6ttq_openstack-operators(93dc2914-4eae-4bd4-a4c3-94122e44f908): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.886219 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvrzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pzg55_openstack-operators(deacbe3a-30ba-42bb-a180-f8e2360ba937): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.887139 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sgch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-s6ttq_openstack-operators(93dc2914-4eae-4bd4-a4c3-94122e44f908): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.887377 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podUID="deacbe3a-30ba-42bb-a180-f8e2360ba937" Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.888229 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" podUID="93dc2914-4eae-4bd4-a4c3-94122e44f908" Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.935641 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" event={"ID":"fa68cc1e-18ec-42d1-a3de-948ef2cc0804","Type":"ContainerStarted","Data":"15fb032136e004116db6f6974e0e7c11f98beadf6d8fd8c5eeacbd540d605faa"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.937383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" event={"ID":"0d42fecd-1b6d-4f29-816c-38515c0a547c","Type":"ContainerStarted","Data":"ea95185afba7ddd32acba6263cb2a59f677c02b73f363d2369b5b55fe85135d0"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.938724 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" event={"ID":"93dc2914-4eae-4bd4-a4c3-94122e44f908","Type":"ContainerStarted","Data":"62ec62d49a163a74a2866c290f4a116f6e8b57d215de5cd2df2798231af27ba3"} Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.941585 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" podUID="93dc2914-4eae-4bd4-a4c3-94122e44f908" Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.941734 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" event={"ID":"7c78ca28-a1c2-45b9-9a14-733aae9ee555","Type":"ContainerStarted","Data":"2373b686663c91bb4050c79762334dd809e323a3936202374f9d77b2f442a596"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.944538 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" event={"ID":"95e14baa-5bbb-4bf6-9420-0428c25cc98f","Type":"ContainerStarted","Data":"8f9fc46d4820326bc406aad40ba5f56314bbb881c976c058e88f8a503aa95432"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.945938 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" event={"ID":"deacbe3a-30ba-42bb-a180-f8e2360ba937","Type":"ContainerStarted","Data":"aecbd80f03dae83110aff0ed78c22abcb4fe2984531d9862a5eaca12d539a504"} Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.947266 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podUID="deacbe3a-30ba-42bb-a180-f8e2360ba937" Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.947977 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" event={"ID":"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd","Type":"ContainerStarted","Data":"ec0e590a7d5de47db04ddd8c8e5263662d7da16cb5a1e0aca20bb9e13ceb276a"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.948831 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" event={"ID":"d2b8bc9b-a359-4fe9-a039-200eed4f7218","Type":"ContainerStarted","Data":"a130d61a41bae10723f97c303dc20756c02b4024363d459fe0e0763eb7b1c672"} Dec 06 15:44:46 crc kubenswrapper[4848]: E1206 15:44:46.950557 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" podUID="d2b8bc9b-a359-4fe9-a039-200eed4f7218" Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.950669 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" event={"ID":"cf3406a1-eb72-4251-bbe3-45a33235ac96","Type":"ContainerStarted","Data":"3eb43df06bb8644e8cee42f97199b49f3786c82d6b32ed70e234234a300f8003"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.951924 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" event={"ID":"90274bce-5d78-4f62-8265-2218bc58916d","Type":"ContainerStarted","Data":"5d593ad9896e8a50753cc2d28de8535a2be1a90cf9726dfee18bbdbe0258069e"} Dec 06 15:44:46 crc kubenswrapper[4848]: I1206 15:44:46.952781 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" event={"ID":"f59ce483-133a-495a-862c-3676661adab8","Type":"ContainerStarted","Data":"49c01be40d2d00c4303ce56c5c96b15580a27b48f56dbf6122703625b66c4612"} Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.150428 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.150491 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.150535 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.151210 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.151275 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc" gracePeriod=600 Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.987774 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc" exitCode=0 Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.987939 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc"} Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.988196 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa"} Dec 06 15:44:47 crc kubenswrapper[4848]: I1206 15:44:47.988229 4848 scope.go:117] "RemoveContainer" containerID="e63fbea36a0e0bb825a9969be1380c579a7bc59e5ffe70c4e6a4da495e1853d8" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.000464 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" podUID="d2b8bc9b-a359-4fe9-a039-200eed4f7218" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.003400 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" podUID="93dc2914-4eae-4bd4-a4c3-94122e44f908" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.002432 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podUID="deacbe3a-30ba-42bb-a180-f8e2360ba937" Dec 06 15:44:48 crc kubenswrapper[4848]: I1206 15:44:48.024647 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.025360 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.025463 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:44:52.025426116 +0000 UTC m=+959.323437029 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: I1206 15:44:48.533964 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.534191 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.534398 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:44:52.534376877 +0000 UTC m=+959.832387790 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: I1206 15:44:48.837906 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:48 crc kubenswrapper[4848]: I1206 15:44:48.838072 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.838226 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.838440 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.838503 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:52.838484037 +0000 UTC m=+960.136494940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "webhook-server-cert" not found Dec 06 15:44:48 crc kubenswrapper[4848]: E1206 15:44:48.838908 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:44:52.838893448 +0000 UTC m=+960.136904361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: I1206 15:44:52.090571 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.091589 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.092107 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:45:00.092089067 +0000 UTC m=+967.390099980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: I1206 15:44:52.598792 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.598951 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.599016 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:45:00.598998812 +0000 UTC m=+967.897009725 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: I1206 15:44:52.902458 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:52 crc kubenswrapper[4848]: I1206 15:44:52.902516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.902627 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.902675 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.902735 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:45:00.902676071 +0000 UTC m=+968.200686984 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "webhook-server-cert" not found Dec 06 15:44:52 crc kubenswrapper[4848]: E1206 15:44:52.902794 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:45:00.902772463 +0000 UTC m=+968.200783456 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.084417 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" event={"ID":"76bfd2a6-6774-4992-91ec-c73327b11bd8","Type":"ContainerStarted","Data":"0b959d3879992b94fa53e94375f8c9f4523565f8cfe9aa357dc2f5e12539d4a4"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.110510 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" event={"ID":"0d42fecd-1b6d-4f29-816c-38515c0a547c","Type":"ContainerStarted","Data":"8664bfb402290b51358ee109fed1017e93d6685d662d95d985837194ef60da79"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.115832 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" 
event={"ID":"7c78ca28-a1c2-45b9-9a14-733aae9ee555","Type":"ContainerStarted","Data":"d1e63122c1f6e8e09e16bf429ac482095d5d9ed2b7831cd0960f5a9ce5321921"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.128290 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" event={"ID":"f59ce483-133a-495a-862c-3676661adab8","Type":"ContainerStarted","Data":"2d03583bdf9ba488cd92cbf0f26a87bd309a4ce39be4ad5dc2d419e10c2bd049"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.131039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" event={"ID":"d99ce981-7f71-4636-94c7-c848830429f3","Type":"ContainerStarted","Data":"6e3b8e88d1ab72764f90c8d21f4bfe758befca7f41d945980c27eb884ea186ef"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.150157 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" event={"ID":"90274bce-5d78-4f62-8265-2218bc58916d","Type":"ContainerStarted","Data":"5d1818f1b4f1edba6f7d5a9a4493fa4f705048c4cbb64fa8db03fed8786f96b3"} Dec 06 15:44:59 crc kubenswrapper[4848]: I1206 15:44:59.164900 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" event={"ID":"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd","Type":"ContainerStarted","Data":"9b75dc094a9ec73ca5fd06d560ad04a653e826d4bc1b28737fc5f4c4ffba3d62"} Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.179213 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wnhzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-69cx9_openstack-operators(706ced85-8889-45e9-bd15-1a2747a9de2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.180492 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" podUID="706ced85-8889-45e9-bd15-1a2747a9de2e" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.213140 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7dl8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-9vx7t_openstack-operators(95e14baa-5bbb-4bf6-9420-0428c25cc98f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.213416 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tsxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-6vq9l_openstack-operators(fd64b532-b259-49e5-bd47-62d9d20b6a69): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.214826 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" podUID="95e14baa-5bbb-4bf6-9420-0428c25cc98f" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.214982 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7vqt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-plnw5_openstack-operators(9418eec1-6430-4bb9-a7be-6ec83f61c629): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.215099 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" podUID="fd64b532-b259-49e5-bd47-62d9d20b6a69" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.216139 4848 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" podUID="9418eec1-6430-4bb9-a7be-6ec83f61c629" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.217296 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rq4kc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-54476ccddc-74npj_openstack-operators(eecbdda5-b888-4fb9-979d-66bb4d8ffcf4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 
15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.218505 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" podUID="eecbdda5-b888-4fb9-979d-66bb4d8ffcf4" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.223048 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzg46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
designate-operator-controller-manager-697fb699cf-rhsqm_openstack-operators(1438e750-61e2-4c37-8d03-f22b8ebad123): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 15:44:59 crc kubenswrapper[4848]: E1206 15:44:59.224988 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" podUID="1438e750-61e2-4c37-8d03-f22b8ebad123" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.131529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.131790 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.131841 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert podName:cf4f4d25-7fc2-411f-9e23-71171162f38a nodeName:}" failed. No retries permitted until 2025-12-06 15:45:16.131826243 +0000 UTC m=+983.429837156 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert") pod "infra-operator-controller-manager-78d48bff9d-4k55x" (UID: "cf4f4d25-7fc2-411f-9e23-71171162f38a") : secret "infra-operator-webhook-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.181820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" event={"ID":"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4","Type":"ContainerStarted","Data":"6f44df63e8981675074358be8c883013e846915ffcf22c675d0d8a1c323a9bb1"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.182681 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.190622 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8"] Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.192080 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.199427 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.199661 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" podUID="eecbdda5-b888-4fb9-979d-66bb4d8ffcf4" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.199733 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.213894 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8"] Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.237979 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" event={"ID":"1438e750-61e2-4c37-8d03-f22b8ebad123","Type":"ContainerStarted","Data":"d438e20fc13b597229ab8a9a1a4a60b8d5df09231a763090b1cbc0083b7f2186"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.238978 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.241651 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" podUID="1438e750-61e2-4c37-8d03-f22b8ebad123" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.257023 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" event={"ID":"fd64b532-b259-49e5-bd47-62d9d20b6a69","Type":"ContainerStarted","Data":"68b24c99492a2212b9fd217f8bf770d78e3f95e964fc20c887f7c05781fc5537"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.257756 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.264981 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" podUID="fd64b532-b259-49e5-bd47-62d9d20b6a69" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.275076 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" event={"ID":"cf3406a1-eb72-4251-bbe3-45a33235ac96","Type":"ContainerStarted","Data":"e3798e7191435c58987bd2d40f51aa839e91b74baf63aaef3e8900a4fd48c2f5"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.280891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" event={"ID":"1b49aafe-c450-441c-ade7-0d87b868dc2a","Type":"ContainerStarted","Data":"d49d29d95b90374679af3662ff7a1c73591c18fe9e8660c978978e05e5ae373a"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.302587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" 
event={"ID":"fa68cc1e-18ec-42d1-a3de-948ef2cc0804","Type":"ContainerStarted","Data":"ab68a875d873a56cb3f2cb1c0d871ddd77f2e53122ec371df8bd0c1ac3150f00"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.316070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" event={"ID":"706ced85-8889-45e9-bd15-1a2747a9de2e","Type":"ContainerStarted","Data":"fa5a1d6c1575b1b31fd3e59aec449b436dfa9cb12526f655735436c88a526098"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.316845 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.333239 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" podUID="706ced85-8889-45e9-bd15-1a2747a9de2e" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.338743 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.347536 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4whc\" (UniqueName: \"kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.347660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.360945 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" event={"ID":"fee1c62f-115d-472d-8617-32a386cf06c2","Type":"ContainerStarted","Data":"7faff0921b33287f60f3ffe2983cb6a0f6384a7ea0d696d1edbd7d4d99827454"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.364596 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" event={"ID":"9418eec1-6430-4bb9-a7be-6ec83f61c629","Type":"ContainerStarted","Data":"7ece11213ece6bf315d8079974a90e396270e9c3b066b66565978a91ddf43a51"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.365258 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.365813 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" podUID="9418eec1-6430-4bb9-a7be-6ec83f61c629" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.411056 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" event={"ID":"95e14baa-5bbb-4bf6-9420-0428c25cc98f","Type":"ContainerStarted","Data":"52bf2285088bef7e0818ed8e5f263ef4413e65e6ab371e0ef71c3ce9991e7397"} Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.411976 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.412901 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" podUID="95e14baa-5bbb-4bf6-9420-0428c25cc98f" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.448593 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.448901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4whc\" (UniqueName: \"kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.448935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume\") pod 
\"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.449860 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.464242 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.474857 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4whc\" (UniqueName: \"kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc\") pod \"collect-profiles-29417265-vphz8\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.526101 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.650585 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.650876 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.650919 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert podName:ec6dd72c-05cb-49f7-af2a-01a76807175c nodeName:}" failed. No retries permitted until 2025-12-06 15:45:16.650906848 +0000 UTC m=+983.948917761 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fgtgmh" (UID: "ec6dd72c-05cb-49f7-af2a-01a76807175c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.955057 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.955119 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.955533 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: E1206 15:45:00.955613 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs podName:bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826 nodeName:}" failed. No retries permitted until 2025-12-06 15:45:16.955591163 +0000 UTC m=+984.253602076 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs") pod "openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" (UID: "bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826") : secret "metrics-server-cert" not found Dec 06 15:45:00 crc kubenswrapper[4848]: I1206 15:45:00.966746 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-webhook-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:01 crc kubenswrapper[4848]: E1206 15:45:01.427165 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" podUID="95e14baa-5bbb-4bf6-9420-0428c25cc98f" Dec 06 15:45:01 crc kubenswrapper[4848]: E1206 15:45:01.427717 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" podUID="fd64b532-b259-49e5-bd47-62d9d20b6a69" Dec 06 15:45:01 crc kubenswrapper[4848]: E1206 15:45:01.427800 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" podUID="706ced85-8889-45e9-bd15-1a2747a9de2e" Dec 06 15:45:01 
crc kubenswrapper[4848]: E1206 15:45:01.427805 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" podUID="9418eec1-6430-4bb9-a7be-6ec83f61c629" Dec 06 15:45:01 crc kubenswrapper[4848]: E1206 15:45:01.427853 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" podUID="1438e750-61e2-4c37-8d03-f22b8ebad123" Dec 06 15:45:01 crc kubenswrapper[4848]: E1206 15:45:01.441624 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" podUID="eecbdda5-b888-4fb9-979d-66bb4d8ffcf4" Dec 06 15:45:01 crc kubenswrapper[4848]: I1206 15:45:01.796877 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8"] Dec 06 15:45:02 crc kubenswrapper[4848]: W1206 15:45:01.999899 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49f850f_92fc_4078_8c80_87ee05cbd097.slice/crio-3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4 WatchSource:0}: Error finding container 3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4: Status 404 returned error can't find the container with id 3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4 Dec 06 15:45:02 crc 
kubenswrapper[4848]: I1206 15:45:02.427160 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" event={"ID":"b49f850f-92fc-4078-8c80-87ee05cbd097","Type":"ContainerStarted","Data":"3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4"} Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.270482 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.272541 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" podUID="706ced85-8889-45e9-bd15-1a2747a9de2e" Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.300682 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.302193 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" podUID="1438e750-61e2-4c37-8d03-f22b8ebad123" Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.341295 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.346332 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" podUID="9418eec1-6430-4bb9-a7be-6ec83f61c629" Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.415687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.419018 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" podUID="fd64b532-b259-49e5-bd47-62d9d20b6a69" Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.651603 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.654192 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" podUID="eecbdda5-b888-4fb9-979d-66bb4d8ffcf4" Dec 06 15:45:04 crc kubenswrapper[4848]: I1206 15:45:04.853518 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" Dec 06 15:45:04 crc kubenswrapper[4848]: E1206 15:45:04.856551 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" podUID="95e14baa-5bbb-4bf6-9420-0428c25cc98f" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.157029 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.162934 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf4f4d25-7fc2-411f-9e23-71171162f38a-cert\") pod \"infra-operator-controller-manager-78d48bff9d-4k55x\" (UID: \"cf4f4d25-7fc2-411f-9e23-71171162f38a\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.300448 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9q2zl" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.309105 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.663079 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.669353 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6dd72c-05cb-49f7-af2a-01a76807175c-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fgtgmh\" (UID: \"ec6dd72c-05cb-49f7-af2a-01a76807175c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.881518 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vj9hq" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.889876 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.967058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:16 crc kubenswrapper[4848]: I1206 15:45:16.975480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826-metrics-certs\") pod \"openstack-operator-controller-manager-6fcf4cdbd6-q2rkz\" (UID: \"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826\") " pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:17 crc kubenswrapper[4848]: I1206 15:45:17.143319 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jdc9c" Dec 06 15:45:17 crc kubenswrapper[4848]: I1206 15:45:17.152074 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:21 crc kubenswrapper[4848]: E1206 15:45:21.116914 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 06 15:45:21 crc kubenswrapper[4848]: E1206 15:45:21.117571 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvrzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pzg55_openstack-operators(deacbe3a-30ba-42bb-a180-f8e2360ba937): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:45:21 crc kubenswrapper[4848]: E1206 15:45:21.118660 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podUID="deacbe3a-30ba-42bb-a180-f8e2360ba937" Dec 06 15:45:21 crc kubenswrapper[4848]: I1206 15:45:21.564973 4848 generic.go:334] "Generic (PLEG): container finished" podID="b49f850f-92fc-4078-8c80-87ee05cbd097" containerID="9589435e3c40d6054683d3420795c2bcf11713d0bb6054ca63f91d1decbf7ced" exitCode=0 Dec 06 15:45:21 crc kubenswrapper[4848]: I1206 15:45:21.565133 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" 
event={"ID":"b49f850f-92fc-4078-8c80-87ee05cbd097","Type":"ContainerDied","Data":"9589435e3c40d6054683d3420795c2bcf11713d0bb6054ca63f91d1decbf7ced"} Dec 06 15:45:21 crc kubenswrapper[4848]: I1206 15:45:21.620123 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz"] Dec 06 15:45:21 crc kubenswrapper[4848]: I1206 15:45:21.662003 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x"] Dec 06 15:45:21 crc kubenswrapper[4848]: I1206 15:45:21.744638 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh"] Dec 06 15:45:21 crc kubenswrapper[4848]: W1206 15:45:21.770975 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec6dd72c_05cb_49f7_af2a_01a76807175c.slice/crio-d13dc8283d3c92bd937e1453127f580f5fa1f83bf6fa3da7d3732260c792679b WatchSource:0}: Error finding container d13dc8283d3c92bd937e1453127f580f5fa1f83bf6fa3da7d3732260c792679b: Status 404 returned error can't find the container with id d13dc8283d3c92bd937e1453127f580f5fa1f83bf6fa3da7d3732260c792679b Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.580991 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" event={"ID":"93dc2914-4eae-4bd4-a4c3-94122e44f908","Type":"ContainerStarted","Data":"e90f28bdec161daaccfe1804704db2e3910a1fcb4b9f099ff6fa8ea68173652a"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.581307 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" event={"ID":"93dc2914-4eae-4bd4-a4c3-94122e44f908","Type":"ContainerStarted","Data":"3af960fea31f9d0ca91abf8c923644863faf71fb9d360b639fbb33480223ce92"} Dec 06 
15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.581500 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.594055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" event={"ID":"7c78ca28-a1c2-45b9-9a14-733aae9ee555","Type":"ContainerStarted","Data":"dc7cf647226875bc1f070e29f3dd66fe074d5916f918e871399b06743a44c1f1"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.595042 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.599228 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" event={"ID":"1b49aafe-c450-441c-ade7-0d87b868dc2a","Type":"ContainerStarted","Data":"abcd0c124c45ae3f7ddf8f6b89cd1d51dc4fd227db8f7d6ad1a244c5b0368ef0"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.599594 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.599622 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.605866 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.619271 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" 
event={"ID":"dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd","Type":"ContainerStarted","Data":"b8971145832e501e78de2ccfbc6d78459fa4be9fc67ad2748e0bcf108dec54bd"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.620163 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.622766 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.623415 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" event={"ID":"d99ce981-7f71-4636-94c7-c848830429f3","Type":"ContainerStarted","Data":"8b652b83d64d439e97bc4f71ba3990ae027104137e879dc0b8634796d0be3fbb"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.623993 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.625353 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.630627 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq" podStartSLOduration=4.388986777 podStartE2EDuration="38.630612815s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.884989445 +0000 UTC m=+954.183000358" lastFinishedPulling="2025-12-06 15:45:21.126615483 +0000 UTC m=+988.424626396" observedRunningTime="2025-12-06 15:45:22.627725547 +0000 UTC m=+989.925736460" watchObservedRunningTime="2025-12-06 15:45:22.630612815 +0000 UTC m=+989.928623728" 
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.634923 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" event={"ID":"fa68cc1e-18ec-42d1-a3de-948ef2cc0804","Type":"ContainerStarted","Data":"8e7657e5e119d52652ef912cbc0532da3167237aec39f8909ad991fe29002175"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.635723 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.642500 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.650642 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jtn8k" podStartSLOduration=4.177991415 podStartE2EDuration="39.650625467s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.755932132 +0000 UTC m=+953.053943045" lastFinishedPulling="2025-12-06 15:45:21.228566184 +0000 UTC m=+988.526577097" observedRunningTime="2025-12-06 15:45:22.650091673 +0000 UTC m=+989.948102596" watchObservedRunningTime="2025-12-06 15:45:22.650625467 +0000 UTC m=+989.948636380" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.654739 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" event={"ID":"0d42fecd-1b6d-4f29-816c-38515c0a547c","Type":"ContainerStarted","Data":"edb6ba451545a9e9801c2f0c6cf4344550be6c4a957bcd9b8ef3b1c84ca80e6e"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.655178 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:45:22 
crc kubenswrapper[4848]: I1206 15:45:22.660686 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.664040 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" event={"ID":"1438e750-61e2-4c37-8d03-f22b8ebad123","Type":"ContainerStarted","Data":"288119002d352104de06a1483053fe26f6da79d3257e9c49f98f47de98fc8e2d"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.672028 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-x7d2b" podStartSLOduration=4.269482399 podStartE2EDuration="38.672010636s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.825610226 +0000 UTC m=+954.123621139" lastFinishedPulling="2025-12-06 15:45:21.228138473 +0000 UTC m=+988.526149376" observedRunningTime="2025-12-06 15:45:22.669064536 +0000 UTC m=+989.967075449" watchObservedRunningTime="2025-12-06 15:45:22.672010636 +0000 UTC m=+989.970021549" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.680854 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" event={"ID":"9418eec1-6430-4bb9-a7be-6ec83f61c629","Type":"ContainerStarted","Data":"e93918631e007553cbaa31686625d6a17a7ea73d01ba5aa5b999010cd94e0863"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.695798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" event={"ID":"ec6dd72c-05cb-49f7-af2a-01a76807175c","Type":"ContainerStarted","Data":"d13dc8283d3c92bd937e1453127f580f5fa1f83bf6fa3da7d3732260c792679b"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.707953 4848 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vpn6w" podStartSLOduration=4.306435141 podStartE2EDuration="38.70792923s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.825787551 +0000 UTC m=+954.123798504" lastFinishedPulling="2025-12-06 15:45:21.22728168 +0000 UTC m=+988.525292593" observedRunningTime="2025-12-06 15:45:22.706720767 +0000 UTC m=+990.004731690" watchObservedRunningTime="2025-12-06 15:45:22.70792923 +0000 UTC m=+990.005940143" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.717963 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" event={"ID":"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826","Type":"ContainerStarted","Data":"399519eaf4951923c8ea3b9e510f547c3f607b93a68988a95121ebfaf79680ec"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.718016 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" event={"ID":"bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826","Type":"ContainerStarted","Data":"ed79988f8d4f95d27abf95f3a567b6ce2236466fd0ad74721618c73abcb54f59"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.718606 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.755873 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" event={"ID":"eecbdda5-b888-4fb9-979d-66bb4d8ffcf4","Type":"ContainerStarted","Data":"597fcda2a094820450c542004e9e722f25bb2c597cd705a9e31049ef7f0b1c12"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.763016 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" event={"ID":"90274bce-5d78-4f62-8265-2218bc58916d","Type":"ContainerStarted","Data":"1e925c98aa44dd845c25fb4829f5811296be1662c778095a707b69168802805e"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.764182 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.764209 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9zcn8" podStartSLOduration=4.45591748 podStartE2EDuration="38.764194054s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.825121073 +0000 UTC m=+954.123131986" lastFinishedPulling="2025-12-06 15:45:21.133397647 +0000 UTC m=+988.431408560" observedRunningTime="2025-12-06 15:45:22.753936456 +0000 UTC m=+990.051947369" watchObservedRunningTime="2025-12-06 15:45:22.764194054 +0000 UTC m=+990.062204967" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.764512 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-dzsgv" podStartSLOduration=4.355683388 podStartE2EDuration="39.764508052s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.764562896 +0000 UTC m=+953.062573809" lastFinishedPulling="2025-12-06 15:45:21.17338756 +0000 UTC m=+988.471398473" observedRunningTime="2025-12-06 15:45:22.741460648 +0000 UTC m=+990.039471551" watchObservedRunningTime="2025-12-06 15:45:22.764508052 +0000 UTC m=+990.062518965" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.776025 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" Dec 06 15:45:22 crc 
kubenswrapper[4848]: I1206 15:45:22.776398 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" event={"ID":"fd64b532-b259-49e5-bd47-62d9d20b6a69","Type":"ContainerStarted","Data":"e473b6befedcbc7ce0cd955f179023112e6e672763693b3d111066acf3f52302"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.788168 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-plnw5" podStartSLOduration=4.009336973 podStartE2EDuration="39.788151213s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.587274862 +0000 UTC m=+952.885285775" lastFinishedPulling="2025-12-06 15:45:21.366089102 +0000 UTC m=+988.664100015" observedRunningTime="2025-12-06 15:45:22.785097591 +0000 UTC m=+990.083108504" watchObservedRunningTime="2025-12-06 15:45:22.788151213 +0000 UTC m=+990.086162126" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.796871 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" event={"ID":"d2b8bc9b-a359-4fe9-a039-200eed4f7218","Type":"ContainerStarted","Data":"7df4da9694de016994e6ab023425cc08eb1e430366c2f3688e995183853ed59c"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.796922 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" event={"ID":"d2b8bc9b-a359-4fe9-a039-200eed4f7218","Type":"ContainerStarted","Data":"4f2c6157d0d97c423e2a716c025091ea5fd324306b70944621849280be88566c"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.797508 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.803953 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" event={"ID":"cf3406a1-eb72-4251-bbe3-45a33235ac96","Type":"ContainerStarted","Data":"b52c183e24baacf1a4c0b14430199361d7f54e1046058be2031d023566e582e8"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.805035 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.810980 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.819218 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" event={"ID":"76bfd2a6-6774-4992-91ec-c73327b11bd8","Type":"ContainerStarted","Data":"0d5809ec20ace0d0558df2c5dd2ed92921a8ec7c1a4d25bcd0e936ba596c3dc3"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.820096 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.829317 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.835432 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" event={"ID":"cf4f4d25-7fc2-411f-9e23-71171162f38a","Type":"ContainerStarted","Data":"5d6946fc358e8c1677e667a39b4e0b2804ee01ff5c1938c924c3bd00af6b5823"} Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.839577 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz" podStartSLOduration=38.839558776 podStartE2EDuration="38.839558776s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:45:22.830772638 +0000 UTC m=+990.128783551" watchObservedRunningTime="2025-12-06 15:45:22.839558776 +0000 UTC m=+990.137569689"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.854789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" event={"ID":"706ced85-8889-45e9-bd15-1a2747a9de2e","Type":"ContainerStarted","Data":"9e663de7cb7faa0b0995cabd837c7fa0846e4e227e658ed87a2d1297ecd17b92"}
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.856349 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-s6hbc" podStartSLOduration=4.477346431 podStartE2EDuration="38.856328321s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.818934125 +0000 UTC m=+954.116945038" lastFinishedPulling="2025-12-06 15:45:21.197916015 +0000 UTC m=+988.495926928" observedRunningTime="2025-12-06 15:45:22.854184433 +0000 UTC m=+990.152195356" watchObservedRunningTime="2025-12-06 15:45:22.856328321 +0000 UTC m=+990.154339244"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.857337 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" event={"ID":"fee1c62f-115d-472d-8617-32a386cf06c2","Type":"ContainerStarted","Data":"de76eb7dd4ce96cf62265a9dd18a80a625451b7e7fa83a7694498f3c1246e2db"}
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.858136 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.865785 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.866813 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" event={"ID":"f59ce483-133a-495a-862c-3676661adab8","Type":"ContainerStarted","Data":"763cc7be66f426e95fbdad8c7f6880abf3056b7b524daa76d12a52497865be99"}
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.867682 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.869587 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.877248 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" event={"ID":"95e14baa-5bbb-4bf6-9420-0428c25cc98f","Type":"ContainerStarted","Data":"2bb86f33f4672cd2e343f68e8abd2b89464c705f9e9ecb8e85b79bb2bd3d5449"}
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.880227 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-rhsqm" podStartSLOduration=3.673642178 podStartE2EDuration="39.880213408s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.210372529 +0000 UTC m=+952.508383442" lastFinishedPulling="2025-12-06 15:45:21.416943759 +0000 UTC m=+988.714954672" observedRunningTime="2025-12-06 15:45:22.877105244 +0000 UTC m=+990.175116167" watchObservedRunningTime="2025-12-06 15:45:22.880213408 +0000 UTC m=+990.178224321"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.907650 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-69cx9" podStartSLOduration=4.120515706 podStartE2EDuration="39.907628581s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.580167819 +0000 UTC m=+952.878178732" lastFinishedPulling="2025-12-06 15:45:21.367280694 +0000 UTC m=+988.665291607" observedRunningTime="2025-12-06 15:45:22.901903356 +0000 UTC m=+990.199914269" watchObservedRunningTime="2025-12-06 15:45:22.907628581 +0000 UTC m=+990.205639494"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.938290 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-4vwqn" podStartSLOduration=3.4613754820000002 podStartE2EDuration="38.938265551s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.752143789 +0000 UTC m=+953.050154702" lastFinishedPulling="2025-12-06 15:45:21.229033858 +0000 UTC m=+988.527044771" observedRunningTime="2025-12-06 15:45:22.93603047 +0000 UTC m=+990.234041393" watchObservedRunningTime="2025-12-06 15:45:22.938265551 +0000 UTC m=+990.236276464"
Dec 06 15:45:22 crc kubenswrapper[4848]: I1206 15:45:22.975903 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-x7tzd" podStartSLOduration=4.488236326 podStartE2EDuration="38.97588077s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.872963089 +0000 UTC m=+954.170974002" lastFinishedPulling="2025-12-06 15:45:21.360607533 +0000 UTC m=+988.658618446" observedRunningTime="2025-12-06 15:45:22.966152396 +0000 UTC m=+990.264163319" watchObservedRunningTime="2025-12-06 15:45:22.97588077 +0000 UTC m=+990.273891683"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.025430 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-cwth7" podStartSLOduration=4.664624476 podStartE2EDuration="39.025406962s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.867631044 +0000 UTC m=+954.165641957" lastFinishedPulling="2025-12-06 15:45:21.22841353 +0000 UTC m=+988.526424443" observedRunningTime="2025-12-06 15:45:23.017322043 +0000 UTC m=+990.315332956" watchObservedRunningTime="2025-12-06 15:45:23.025406962 +0000 UTC m=+990.323417875"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.064177 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4gqwc" podStartSLOduration=3.705928999 podStartE2EDuration="39.064157172s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.759340524 +0000 UTC m=+953.057351427" lastFinishedPulling="2025-12-06 15:45:21.117568687 +0000 UTC m=+988.415579600" observedRunningTime="2025-12-06 15:45:23.055145868 +0000 UTC m=+990.353156781" watchObservedRunningTime="2025-12-06 15:45:23.064157172 +0000 UTC m=+990.362168095"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.096652 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-54476ccddc-74npj" podStartSLOduration=3.478926848 podStartE2EDuration="39.096633002s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.750150175 +0000 UTC m=+953.048161088" lastFinishedPulling="2025-12-06 15:45:21.367856329 +0000 UTC m=+988.665867242" observedRunningTime="2025-12-06 15:45:23.090196388 +0000 UTC m=+990.388207301" watchObservedRunningTime="2025-12-06 15:45:23.096633002 +0000 UTC m=+990.394643905"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.111608 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6vq9l" podStartSLOduration=4.452688917 podStartE2EDuration="40.111589757s" podCreationTimestamp="2025-12-06 15:44:43 +0000 UTC" firstStartedPulling="2025-12-06 15:44:45.750134294 +0000 UTC m=+953.048145207" lastFinishedPulling="2025-12-06 15:45:21.409035134 +0000 UTC m=+988.707046047" observedRunningTime="2025-12-06 15:45:23.109487451 +0000 UTC m=+990.407498364" watchObservedRunningTime="2025-12-06 15:45:23.111589757 +0000 UTC m=+990.409600670"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.144348 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bxhmf" podStartSLOduration=4.649110976 podStartE2EDuration="39.144324695s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.825177604 +0000 UTC m=+954.123188517" lastFinishedPulling="2025-12-06 15:45:21.320391323 +0000 UTC m=+988.618402236" observedRunningTime="2025-12-06 15:45:23.13385221 +0000 UTC m=+990.431863123" watchObservedRunningTime="2025-12-06 15:45:23.144324695 +0000 UTC m=+990.442335608"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.170226 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9" podStartSLOduration=4.898881793 podStartE2EDuration="39.170206515s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.87411502 +0000 UTC m=+954.172125933" lastFinishedPulling="2025-12-06 15:45:21.145439742 +0000 UTC m=+988.443450655" observedRunningTime="2025-12-06 15:45:23.150946343 +0000 UTC m=+990.448957266" watchObservedRunningTime="2025-12-06 15:45:23.170206515 +0000 UTC m=+990.468217428"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.174492 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-9vx7t" podStartSLOduration=4.630899772 podStartE2EDuration="39.174474911s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.824719452 +0000 UTC m=+954.122730365" lastFinishedPulling="2025-12-06 15:45:21.368294601 +0000 UTC m=+988.666305504" observedRunningTime="2025-12-06 15:45:23.171008958 +0000 UTC m=+990.469019891" watchObservedRunningTime="2025-12-06 15:45:23.174474911 +0000 UTC m=+990.472485824"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.451000 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.493609 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume\") pod \"b49f850f-92fc-4078-8c80-87ee05cbd097\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") "
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.493735 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume\") pod \"b49f850f-92fc-4078-8c80-87ee05cbd097\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") "
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.493852 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4whc\" (UniqueName: \"kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc\") pod \"b49f850f-92fc-4078-8c80-87ee05cbd097\" (UID: \"b49f850f-92fc-4078-8c80-87ee05cbd097\") "
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.495689 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume" (OuterVolumeSpecName: "config-volume") pod "b49f850f-92fc-4078-8c80-87ee05cbd097" (UID: "b49f850f-92fc-4078-8c80-87ee05cbd097"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.499895 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc" (OuterVolumeSpecName: "kube-api-access-m4whc") pod "b49f850f-92fc-4078-8c80-87ee05cbd097" (UID: "b49f850f-92fc-4078-8c80-87ee05cbd097"). InnerVolumeSpecName "kube-api-access-m4whc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.519747 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b49f850f-92fc-4078-8c80-87ee05cbd097" (UID: "b49f850f-92fc-4078-8c80-87ee05cbd097"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.595297 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4whc\" (UniqueName: \"kubernetes.io/projected/b49f850f-92fc-4078-8c80-87ee05cbd097-kube-api-access-m4whc\") on node \"crc\" DevicePath \"\""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.595334 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49f850f-92fc-4078-8c80-87ee05cbd097-config-volume\") on node \"crc\" DevicePath \"\""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.595343 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49f850f-92fc-4078-8c80-87ee05cbd097-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.886447 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8" event={"ID":"b49f850f-92fc-4078-8c80-87ee05cbd097","Type":"ContainerDied","Data":"3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4"}
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.886508 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3766d5bd35c875d10905d4dc6bec6030468b700615fb49eea7dee86580d3c7e4"
Dec 06 15:45:23 crc kubenswrapper[4848]: I1206 15:45:23.886682 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417265-vphz8"
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.903090 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" event={"ID":"ec6dd72c-05cb-49f7-af2a-01a76807175c","Type":"ContainerStarted","Data":"30a8a69d723f3c273d47991f25b463bdb790770897268f8cb568c2c6853602fd"}
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.903394 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" event={"ID":"ec6dd72c-05cb-49f7-af2a-01a76807175c","Type":"ContainerStarted","Data":"1b8f87762677fb204377392687d565cbe89c441e42ce05cd73758c82bf86e781"}
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.903409 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh"
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.905124 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" event={"ID":"cf4f4d25-7fc2-411f-9e23-71171162f38a","Type":"ContainerStarted","Data":"15aee858e3c6a10e8c75bdfe8cfc3259f6b10dddc9b6e2bca15a7e58a7504179"}
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.905170 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" event={"ID":"cf4f4d25-7fc2-411f-9e23-71171162f38a","Type":"ContainerStarted","Data":"6a3b5ef09f7abafcf8eed908d82ff4d112225228712d47cbe07b2010f51e0f49"}
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.905236 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x"
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.929308 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh" podStartSLOduration=38.769466987 podStartE2EDuration="41.929289036s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:45:21.791164179 +0000 UTC m=+989.089175092" lastFinishedPulling="2025-12-06 15:45:24.950986228 +0000 UTC m=+992.248997141" observedRunningTime="2025-12-06 15:45:25.92721871 +0000 UTC m=+993.225229623" watchObservedRunningTime="2025-12-06 15:45:25.929289036 +0000 UTC m=+993.227299949"
Dec 06 15:45:25 crc kubenswrapper[4848]: I1206 15:45:25.948568 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x" podStartSLOduration=38.714991871 podStartE2EDuration="41.948550028s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:45:21.736422536 +0000 UTC m=+989.034433449" lastFinishedPulling="2025-12-06 15:45:24.969980693 +0000 UTC m=+992.267991606" observedRunningTime="2025-12-06 15:45:25.944106647 +0000 UTC m=+993.242117560" watchObservedRunningTime="2025-12-06 15:45:25.948550028 +0000 UTC m=+993.246560941"
Dec 06 15:45:27 crc kubenswrapper[4848]: I1206 15:45:27.157693 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6fcf4cdbd6-q2rkz"
Dec 06 15:45:35 crc kubenswrapper[4848]: I1206 15:45:35.253061 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-5h8p9"
Dec 06 15:45:35 crc kubenswrapper[4848]: I1206 15:45:35.401075 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-s6ttq"
Dec 06 15:45:35 crc kubenswrapper[4848]: E1206 15:45:35.968624 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podUID="deacbe3a-30ba-42bb-a180-f8e2360ba937"
Dec 06 15:45:36 crc kubenswrapper[4848]: I1206 15:45:36.316165 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-4k55x"
Dec 06 15:45:36 crc kubenswrapper[4848]: I1206 15:45:36.896100 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fgtgmh"
Dec 06 15:45:52 crc kubenswrapper[4848]: I1206 15:45:52.091641 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" event={"ID":"deacbe3a-30ba-42bb-a180-f8e2360ba937","Type":"ContainerStarted","Data":"29bcab27aba4f1e94cc891e947c9be9244baec573c79ee620adcf6d3ab962d26"}
Dec 06 15:45:52 crc kubenswrapper[4848]: I1206 15:45:52.114635 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pzg55" podStartSLOduration=3.862600245 podStartE2EDuration="1m8.114616613s" podCreationTimestamp="2025-12-06 15:44:44 +0000 UTC" firstStartedPulling="2025-12-06 15:44:46.886066584 +0000 UTC m=+954.184077507" lastFinishedPulling="2025-12-06 15:45:51.138082962 +0000 UTC m=+1018.436093875" observedRunningTime="2025-12-06 15:45:52.114113399 +0000 UTC m=+1019.412124482" watchObservedRunningTime="2025-12-06 15:45:52.114616613 +0000 UTC m=+1019.412627526"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.358881 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"]
Dec 06 15:46:08 crc kubenswrapper[4848]: E1206 15:46:08.359657 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49f850f-92fc-4078-8c80-87ee05cbd097" containerName="collect-profiles"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.359672 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49f850f-92fc-4078-8c80-87ee05cbd097" containerName="collect-profiles"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.359915 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49f850f-92fc-4078-8c80-87ee05cbd097" containerName="collect-profiles"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.364464 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.368770 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.368862 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.368934 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6d8np"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.369034 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.389790 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"]
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.427806 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbsv6\" (UniqueName: \"kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.427926 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.435533 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"]
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.437846 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.440318 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.481581 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"]
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.529617 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbsv6\" (UniqueName: \"kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.529982 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtp9\" (UniqueName: \"kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.530166 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.530290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.530430 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.531597 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.554026 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbsv6\" (UniqueName: \"kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6\") pod \"dnsmasq-dns-675f4bcbfc-26j2b\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.631755 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.632433 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.632585 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtp9\" (UniqueName: \"kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.633903 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.634655 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.654201 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtp9\" (UniqueName: \"kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9\") pod \"dnsmasq-dns-78dd6ddcc-6z85k\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.689254 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b"
Dec 06 15:46:08 crc kubenswrapper[4848]: I1206 15:46:08.794347 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k"
Dec 06 15:46:09 crc kubenswrapper[4848]: I1206 15:46:09.092358 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"]
Dec 06 15:46:09 crc kubenswrapper[4848]: I1206 15:46:09.098472 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 15:46:09 crc kubenswrapper[4848]: I1206 15:46:09.199956 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b" event={"ID":"92c91026-c3e6-4432-8433-5bd51e9288a0","Type":"ContainerStarted","Data":"8d84d79e605a28bfeadb386edd348e7ed889d8e043c3e47d4e5d80b97b5524eb"}
Dec 06 15:46:09 crc kubenswrapper[4848]: I1206 15:46:09.221433 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"]
Dec 06 15:46:09 crc kubenswrapper[4848]: W1206 15:46:09.223480 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda19cfcdb_e386_4dc3_a94c_095985b64a1d.slice/crio-fdb7c65d88af31ffcb7ec8e7faa88dad6fc9cd8e24a7b6daf0139b738c3908db WatchSource:0}: Error finding container fdb7c65d88af31ffcb7ec8e7faa88dad6fc9cd8e24a7b6daf0139b738c3908db: Status 404 returned error can't find the container with id fdb7c65d88af31ffcb7ec8e7faa88dad6fc9cd8e24a7b6daf0139b738c3908db
Dec 06 15:46:10 crc kubenswrapper[4848]: I1206 15:46:10.207109 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k" event={"ID":"a19cfcdb-e386-4dc3-a94c-095985b64a1d","Type":"ContainerStarted","Data":"fdb7c65d88af31ffcb7ec8e7faa88dad6fc9cd8e24a7b6daf0139b738c3908db"}
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.432253 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.459104 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.465394 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.481043 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.577736 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.577887 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.578065 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjcv\" (UniqueName: \"kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.680164 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.680289 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjcv\" (UniqueName: \"kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.680344 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.681292 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.681306 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.703930 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjcv\" (UniqueName: \"kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv\") pod \"dnsmasq-dns-666b6646f7-9msbr\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.730179 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.741750 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.743667 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.748880 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"]
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.804250 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9msbr"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.804310 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.804401 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.805099 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.905980 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.906294 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.906333 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.907188 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.907265 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:11 crc kubenswrapper[4848]: I1206 15:46:11.926328 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq\") pod \"dnsmasq-dns-57d769cc4f-l66z5\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.568263 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.592067 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"]
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.612008 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.618933 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.619273 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.621985 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.623990 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vd4vp"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.624078 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.624125 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.624201 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.624215 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.624210 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772482 4848 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772518 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772538 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772742 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772821 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772860 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772918 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ct5c\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.772987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.773037 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.773070 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.773101 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.875576 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876640 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ct5c\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876781 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " 
pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876799 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876861 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.876884 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.877605 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.877645 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.877664 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.877990 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.878510 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.878751 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.880813 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.883435 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.883742 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.883830 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.886392 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.888327 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.891007 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.894293 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9ct5c\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.913725 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " pod="openstack/rabbitmq-server-0" Dec 06 15:46:12 crc kubenswrapper[4848]: I1206 15:46:12.949926 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.004473 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.006770 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.006874 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.009495 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.013483 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.015439 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.015842 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.016045 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.016627 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.021525 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5pvs9" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.059806 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"] Dec 06 15:46:13 crc kubenswrapper[4848]: W1206 15:46:13.060777 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8715bfa_e1a1_4467_8543_6807e7facc8e.slice/crio-a3f6109fe48f288af7a0f3edcdf294c9c799dfd783eea73bc5c476a49be590a1 WatchSource:0}: Error finding container a3f6109fe48f288af7a0f3edcdf294c9c799dfd783eea73bc5c476a49be590a1: Status 404 returned error can't find the container with id a3f6109fe48f288af7a0f3edcdf294c9c799dfd783eea73bc5c476a49be590a1 Dec 06 
15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182156 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182249 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182274 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjd4k\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182344 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182515 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 
crc kubenswrapper[4848]: I1206 15:46:13.182643 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182769 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182933 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182957 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.182982 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 
15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.183008 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284670 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284764 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284821 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284846 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284875 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284914 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284957 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.284985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjd4k\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.285006 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.285046 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.285085 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.287015 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.287963 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.287967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.288343 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.314473 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.462289 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:46:13 crc kubenswrapper[4848]: W1206 15:46:13.478139 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e9e88c_6b1c_44dc_b00f_f8e4a25c5a75.slice/crio-959b8132da20e7c73f4fe18f58777e716e0a81d28816984e7c4ed67dbf73bb93 WatchSource:0}: Error finding container 959b8132da20e7c73f4fe18f58777e716e0a81d28816984e7c4ed67dbf73bb93: Status 404 returned error can't find the container with id 959b8132da20e7c73f4fe18f58777e716e0a81d28816984e7c4ed67dbf73bb93 Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.587162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" event={"ID":"a8715bfa-e1a1-4467-8543-6807e7facc8e","Type":"ContainerStarted","Data":"a3f6109fe48f288af7a0f3edcdf294c9c799dfd783eea73bc5c476a49be590a1"} Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.589807 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerStarted","Data":"959b8132da20e7c73f4fe18f58777e716e0a81d28816984e7c4ed67dbf73bb93"} Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.591893 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" 
event={"ID":"401b18b9-ead2-44fb-b61e-6ece6116f6f1","Type":"ContainerStarted","Data":"75cc376258ccf94239e73ae28a2c5c1377131f38525f00b9e61002e3b9d6bf8f"} Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.770479 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.770909 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.771043 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.771114 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.771341 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.788975 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.792380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjd4k\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k\") pod \"rabbitmq-cell1-server-0\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:13 crc kubenswrapper[4848]: I1206 15:46:13.950653 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.263325 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.265437 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.267464 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-b9j2x" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.272030 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.272464 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.272612 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.276554 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.283778 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.399867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.399911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.399938 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.399956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.400008 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gwg\" (UniqueName: \"kubernetes.io/projected/a1c66a34-c907-4841-92b1-0799522b6bd5-kube-api-access-g8gwg\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.400035 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.400066 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.400086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503374 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503422 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503447 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503473 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503506 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8gwg\" (UniqueName: \"kubernetes.io/projected/a1c66a34-c907-4841-92b1-0799522b6bd5-kube-api-access-g8gwg\") pod \"openstack-galera-0\" 
(UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503535 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503558 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.503578 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.504231 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.504258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.504611 4848 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.507582 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.508194 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1c66a34-c907-4841-92b1-0799522b6bd5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.510464 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.511333 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c66a34-c907-4841-92b1-0799522b6bd5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.529592 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.534302 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8gwg\" (UniqueName: \"kubernetes.io/projected/a1c66a34-c907-4841-92b1-0799522b6bd5-kube-api-access-g8gwg\") pod \"openstack-galera-0\" (UID: \"a1c66a34-c907-4841-92b1-0799522b6bd5\") " pod="openstack/openstack-galera-0" Dec 06 15:46:14 crc kubenswrapper[4848]: I1206 15:46:14.597533 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.551785 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.553768 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.556328 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vj58m" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.556485 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.556504 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.556879 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.562746 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726600 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65j5\" (UniqueName: \"kubernetes.io/projected/bb6169da-9db4-4d22-bd22-aaf2322103df-kube-api-access-f65j5\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726646 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726708 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726734 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726751 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726771 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.726793 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.827924 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.827974 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.827990 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.828012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.828032 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.828108 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65j5\" (UniqueName: \"kubernetes.io/projected/bb6169da-9db4-4d22-bd22-aaf2322103df-kube-api-access-f65j5\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.828124 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.828143 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.829033 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.829456 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.829518 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.829734 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.829798 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/bb6169da-9db4-4d22-bd22-aaf2322103df-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.841921 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.846569 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6169da-9db4-4d22-bd22-aaf2322103df-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.846962 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.847882 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.855994 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.856063 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.856230 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-b2q92" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.895988 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.908442 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65j5\" (UniqueName: \"kubernetes.io/projected/bb6169da-9db4-4d22-bd22-aaf2322103df-kube-api-access-f65j5\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.908575 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb6169da-9db4-4d22-bd22-aaf2322103df\") " pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.930890 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.930955 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7tzmb\" (UniqueName: \"kubernetes.io/projected/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kube-api-access-7tzmb\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.931061 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kolla-config\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.931233 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:15 crc kubenswrapper[4848]: I1206 15:46:15.931311 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-config-data\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.032764 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.032825 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-config-data\") pod \"memcached-0\" (UID: 
\"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.032861 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.032877 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzmb\" (UniqueName: \"kubernetes.io/projected/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kube-api-access-7tzmb\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.032911 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kolla-config\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.034139 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-config-data\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.034221 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kolla-config\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.037194 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.037630 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.063392 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzmb\" (UniqueName: \"kubernetes.io/projected/d62dd990-abf6-47e3-aafd-5e7efb0ab5c6-kube-api-access-7tzmb\") pod \"memcached-0\" (UID: \"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6\") " pod="openstack/memcached-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.188860 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:16 crc kubenswrapper[4848]: I1206 15:46:16.286539 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 15:46:17 crc kubenswrapper[4848]: I1206 15:46:17.828001 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:46:17 crc kubenswrapper[4848]: I1206 15:46:17.829142 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 06 15:46:17 crc kubenswrapper[4848]: I1206 15:46:17.837009 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 06 15:46:17 crc kubenswrapper[4848]: I1206 15:46:17.841028 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xmshw"
Dec 06 15:46:17 crc kubenswrapper[4848]: I1206 15:46:17.966814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47pk\" (UniqueName: \"kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk\") pod \"kube-state-metrics-0\" (UID: \"e3135a6e-f317-4bd7-84dd-788f65fc87a0\") " pod="openstack/kube-state-metrics-0"
Dec 06 15:46:18 crc kubenswrapper[4848]: I1206 15:46:18.068545 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47pk\" (UniqueName: \"kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk\") pod \"kube-state-metrics-0\" (UID: \"e3135a6e-f317-4bd7-84dd-788f65fc87a0\") " pod="openstack/kube-state-metrics-0"
Dec 06 15:46:18 crc kubenswrapper[4848]: I1206 15:46:18.087290 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47pk\" (UniqueName: \"kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk\") pod \"kube-state-metrics-0\" (UID: \"e3135a6e-f317-4bd7-84dd-788f65fc87a0\") " pod="openstack/kube-state-metrics-0"
Dec 06 15:46:18 crc kubenswrapper[4848]: I1206 15:46:18.146721 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.342123 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g6wzf"]
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.347451 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.353336 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-qnr5m"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.353584 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.353897 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.356653 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g6wzf"]
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.365292 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fcx5h"]
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.368112 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.397373 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fcx5h"]
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519461 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93c0a1e4-91cd-4801-8439-a41fb872135f-scripts\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519672 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-etc-ovs\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519770 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-lib\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-ovn-controller-tls-certs\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9zx\" (UniqueName: \"kubernetes.io/projected/978d5b2d-7113-4b4e-944a-2681e5da434d-kube-api-access-pn9zx\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.519982 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978d5b2d-7113-4b4e-944a-2681e5da434d-scripts\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520016 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520041 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-log\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520064 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-run\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520160 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944l8\" (UniqueName: \"kubernetes.io/projected/93c0a1e4-91cd-4801-8439-a41fb872135f-kube-api-access-944l8\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520207 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-log-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520230 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-combined-ca-bundle\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.520290 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.621944 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978d5b2d-7113-4b4e-944a-2681e5da434d-scripts\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-log\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622247 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-run\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-944l8\" (UniqueName: \"kubernetes.io/projected/93c0a1e4-91cd-4801-8439-a41fb872135f-kube-api-access-944l8\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622317 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-log-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622377 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-combined-ca-bundle\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622413 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622453 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93c0a1e4-91cd-4801-8439-a41fb872135f-scripts\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622496 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-etc-ovs\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622523 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-lib\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622555 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-ovn-controller-tls-certs\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.622587 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9zx\" (UniqueName: \"kubernetes.io/projected/978d5b2d-7113-4b4e-944a-2681e5da434d-kube-api-access-pn9zx\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623475 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-log-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-run\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623718 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623763 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/93c0a1e4-91cd-4801-8439-a41fb872135f-var-run-ovn\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623779 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-log\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.623899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-var-lib\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.624042 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/978d5b2d-7113-4b4e-944a-2681e5da434d-etc-ovs\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.626032 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/978d5b2d-7113-4b4e-944a-2681e5da434d-scripts\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.626276 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93c0a1e4-91cd-4801-8439-a41fb872135f-scripts\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.633013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-ovn-controller-tls-certs\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.633361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c0a1e4-91cd-4801-8439-a41fb872135f-combined-ca-bundle\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.645573 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9zx\" (UniqueName: \"kubernetes.io/projected/978d5b2d-7113-4b4e-944a-2681e5da434d-kube-api-access-pn9zx\") pod \"ovn-controller-ovs-fcx5h\" (UID: \"978d5b2d-7113-4b4e-944a-2681e5da434d\") " pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.646426 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944l8\" (UniqueName: \"kubernetes.io/projected/93c0a1e4-91cd-4801-8439-a41fb872135f-kube-api-access-944l8\") pod \"ovn-controller-g6wzf\" (UID: \"93c0a1e4-91cd-4801-8439-a41fb872135f\") " pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.697602 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g6wzf"
Dec 06 15:46:20 crc kubenswrapper[4848]: I1206 15:46:20.706002 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fcx5h"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.834371 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.836228 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.840779 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dcr6q"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.849119 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.853467 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.853605 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.853773 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.862393 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976475 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976541 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976562 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976599 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976615 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976638 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:23 crc kubenswrapper[4848]: I1206 15:46:23.976657 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6rks\" (UniqueName: \"kubernetes.io/projected/2928825b-3e1c-48cd-827e-afad27fe84c1-kube-api-access-l6rks\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.078125 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.079276 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.080948 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.080990 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.081095 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.081116 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.081163 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.081227 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6rks\" (UniqueName: \"kubernetes.io/projected/2928825b-3e1c-48cd-827e-afad27fe84c1-kube-api-access-l6rks\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.081332 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.082722 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.083873 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.083934 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2928825b-3e1c-48cd-827e-afad27fe84c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.089868 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.090978 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.091643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2928825b-3e1c-48cd-827e-afad27fe84c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.108788 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6rks\" (UniqueName: \"kubernetes.io/projected/2928825b-3e1c-48cd-827e-afad27fe84c1-kube-api-access-l6rks\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.127501 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2928825b-3e1c-48cd-827e-afad27fe84c1\") " pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:24 crc kubenswrapper[4848]: I1206 15:46:24.176469 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.262662 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.265375 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.268741 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.268944 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.269458 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mh8pd"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.270123 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.275036 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.401971 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402039 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402108 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402150 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402167 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402196 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402215 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5cql\" (UniqueName: \"kubernetes.io/projected/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-kube-api-access-b5cql\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.402238 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503445 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503485 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503521 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503537 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503568 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503588 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5cql\" (UniqueName: \"kubernetes.io/projected/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-kube-api-access-b5cql\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503613 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.503961 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.504380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.505041 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.505414 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.507876 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.513167 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.513215 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.520158 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5cql\" (UniqueName: \"kubernetes.io/projected/2d723dc9-fd9d-4b78-9fa6-c18e8656f634-kube-api-access-b5cql\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.523848 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d723dc9-fd9d-4b78-9fa6-c18e8656f634\") " pod="openstack/ovsdbserver-sb-0"
Dec 06 15:46:25 crc kubenswrapper[4848]: I1206 15:46:25.602811 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.769488 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.769637 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fhtp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6z85k_openstack(a19cfcdb-e386-4dc3-a94c-095985b64a1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.770853 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k" podUID="a19cfcdb-e386-4dc3-a94c-095985b64a1d" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.794505 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.795524 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbsv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-26j2b_openstack(92c91026-c3e6-4432-8433-5bd51e9288a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:46:25 crc kubenswrapper[4848]: E1206 15:46:25.796728 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b" podUID="92c91026-c3e6-4432-8433-5bd51e9288a0" Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.173299 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.195973 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.201556 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddda76265_1c2c_4409_8460_99bc3ab509c6.slice/crio-8db94531ef80ba7234486ee2fa33fa133112c2b769013f0ce9cc6ede377335bb WatchSource:0}: Error finding container 8db94531ef80ba7234486ee2fa33fa133112c2b769013f0ce9cc6ede377335bb: Status 404 returned error can't find the container with id 8db94531ef80ba7234486ee2fa33fa133112c2b769013f0ce9cc6ede377335bb Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.339307 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.348408 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62dd990_abf6_47e3_aafd_5e7efb0ab5c6.slice/crio-4fc0a772c6923da6bcc9f009fec693ea8ed0d7ea398e3d1bb68497f39ab25b3c WatchSource:0}: Error finding container 4fc0a772c6923da6bcc9f009fec693ea8ed0d7ea398e3d1bb68497f39ab25b3c: Status 404 returned error can't find the container with id 4fc0a772c6923da6bcc9f009fec693ea8ed0d7ea398e3d1bb68497f39ab25b3c Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.407833 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g6wzf"] Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.431547 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3135a6e_f317_4bd7_84dd_788f65fc87a0.slice/crio-ef63aa2111c7d56a75863302b8aa03ba21b70b5591d23fdb14c85de8ad81c276 WatchSource:0}: Error finding container ef63aa2111c7d56a75863302b8aa03ba21b70b5591d23fdb14c85de8ad81c276: Status 404 returned error can't find the container with id ef63aa2111c7d56a75863302b8aa03ba21b70b5591d23fdb14c85de8ad81c276 Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.432963 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.433170 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c0a1e4_91cd_4801_8439_a41fb872135f.slice/crio-8cda39102deba505eadd5ef46530eb1bd721d6385d154cffad325a636f14a965 WatchSource:0}: Error finding container 8cda39102deba505eadd5ef46530eb1bd721d6385d154cffad325a636f14a965: Status 404 returned error can't find the container with id 8cda39102deba505eadd5ef46530eb1bd721d6385d154cffad325a636f14a965 Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.435172 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c66a34_c907_4841_92b1_0799522b6bd5.slice/crio-abe4164a2d18dc99db5fc65ecf84ca1b08a608f35b85b38cd5697f4ac95b0a7d WatchSource:0}: Error finding container abe4164a2d18dc99db5fc65ecf84ca1b08a608f35b85b38cd5697f4ac95b0a7d: Status 404 returned error can't find the container with id abe4164a2d18dc99db5fc65ecf84ca1b08a608f35b85b38cd5697f4ac95b0a7d Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.440646 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.496684 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fcx5h"] Dec 06 15:46:26 crc 
kubenswrapper[4848]: W1206 15:46:26.497398 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978d5b2d_7113_4b4e_944a_2681e5da434d.slice/crio-d4dfbe87929811f85bb1db757602e03507c2ad630932bbf7210840b54a412872 WatchSource:0}: Error finding container d4dfbe87929811f85bb1db757602e03507c2ad630932bbf7210840b54a412872: Status 404 returned error can't find the container with id d4dfbe87929811f85bb1db757602e03507c2ad630932bbf7210840b54a412872 Dec 06 15:46:26 crc kubenswrapper[4848]: W1206 15:46:26.667156 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d723dc9_fd9d_4b78_9fa6_c18e8656f634.slice/crio-1d3f98a537e81287594883b503095d467d107037679917d577502471bcfdcaf0 WatchSource:0}: Error finding container 1d3f98a537e81287594883b503095d467d107037679917d577502471bcfdcaf0: Status 404 returned error can't find the container with id 1d3f98a537e81287594883b503095d467d107037679917d577502471bcfdcaf0 Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.669331 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.699777 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6","Type":"ContainerStarted","Data":"4fc0a772c6923da6bcc9f009fec693ea8ed0d7ea398e3d1bb68497f39ab25b3c"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.701619 4848 generic.go:334] "Generic (PLEG): container finished" podID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerID="fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017" exitCode=0 Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.701666 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" 
event={"ID":"a8715bfa-e1a1-4467-8543-6807e7facc8e","Type":"ContainerDied","Data":"fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.703077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1c66a34-c907-4841-92b1-0799522b6bd5","Type":"ContainerStarted","Data":"abe4164a2d18dc99db5fc65ecf84ca1b08a608f35b85b38cd5697f4ac95b0a7d"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.704037 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf" event={"ID":"93c0a1e4-91cd-4801-8439-a41fb872135f","Type":"ContainerStarted","Data":"8cda39102deba505eadd5ef46530eb1bd721d6385d154cffad325a636f14a965"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.707210 4848 generic.go:334] "Generic (PLEG): container finished" podID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerID="4cc2f4c57a2b2ddcf52a789daf50288bdf8aa2045f2a1d3fe21ad611d97ff62a" exitCode=0 Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.707269 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" event={"ID":"401b18b9-ead2-44fb-b61e-6ece6116f6f1","Type":"ContainerDied","Data":"4cc2f4c57a2b2ddcf52a789daf50288bdf8aa2045f2a1d3fe21ad611d97ff62a"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.708526 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d723dc9-fd9d-4b78-9fa6-c18e8656f634","Type":"ContainerStarted","Data":"1d3f98a537e81287594883b503095d467d107037679917d577502471bcfdcaf0"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.709520 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb6169da-9db4-4d22-bd22-aaf2322103df","Type":"ContainerStarted","Data":"819143a3c245934d6460eb6d9a319f4a79a6702374c28fa7f4b92feacfbe628b"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 
15:46:26.713196 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e3135a6e-f317-4bd7-84dd-788f65fc87a0","Type":"ContainerStarted","Data":"ef63aa2111c7d56a75863302b8aa03ba21b70b5591d23fdb14c85de8ad81c276"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.714446 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerStarted","Data":"8db94531ef80ba7234486ee2fa33fa133112c2b769013f0ce9cc6ede377335bb"} Dec 06 15:46:26 crc kubenswrapper[4848]: I1206 15:46:26.719115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcx5h" event={"ID":"978d5b2d-7113-4b4e-944a-2681e5da434d","Type":"ContainerStarted","Data":"d4dfbe87929811f85bb1db757602e03507c2ad630932bbf7210840b54a412872"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.053575 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.192233 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.196593 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.343803 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc\") pod \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.343865 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config\") pod \"92c91026-c3e6-4432-8433-5bd51e9288a0\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.343950 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbsv6\" (UniqueName: \"kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6\") pod \"92c91026-c3e6-4432-8433-5bd51e9288a0\" (UID: \"92c91026-c3e6-4432-8433-5bd51e9288a0\") " Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.343987 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhtp9\" (UniqueName: \"kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9\") pod \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.344063 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config\") pod \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\" (UID: \"a19cfcdb-e386-4dc3-a94c-095985b64a1d\") " Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.346027 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config" (OuterVolumeSpecName: "config") pod "92c91026-c3e6-4432-8433-5bd51e9288a0" (UID: "92c91026-c3e6-4432-8433-5bd51e9288a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.346025 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a19cfcdb-e386-4dc3-a94c-095985b64a1d" (UID: "a19cfcdb-e386-4dc3-a94c-095985b64a1d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.346533 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config" (OuterVolumeSpecName: "config") pod "a19cfcdb-e386-4dc3-a94c-095985b64a1d" (UID: "a19cfcdb-e386-4dc3-a94c-095985b64a1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.349991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6" (OuterVolumeSpecName: "kube-api-access-qbsv6") pod "92c91026-c3e6-4432-8433-5bd51e9288a0" (UID: "92c91026-c3e6-4432-8433-5bd51e9288a0"). InnerVolumeSpecName "kube-api-access-qbsv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.350368 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9" (OuterVolumeSpecName: "kube-api-access-fhtp9") pod "a19cfcdb-e386-4dc3-a94c-095985b64a1d" (UID: "a19cfcdb-e386-4dc3-a94c-095985b64a1d"). InnerVolumeSpecName "kube-api-access-fhtp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.446272 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbsv6\" (UniqueName: \"kubernetes.io/projected/92c91026-c3e6-4432-8433-5bd51e9288a0-kube-api-access-qbsv6\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.446303 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhtp9\" (UniqueName: \"kubernetes.io/projected/a19cfcdb-e386-4dc3-a94c-095985b64a1d-kube-api-access-fhtp9\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.446314 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.446322 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a19cfcdb-e386-4dc3-a94c-095985b64a1d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.446331 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92c91026-c3e6-4432-8433-5bd51e9288a0-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.727990 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" event={"ID":"a8715bfa-e1a1-4467-8543-6807e7facc8e","Type":"ContainerStarted","Data":"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.728334 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.731092 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerStarted","Data":"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.734944 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b" event={"ID":"92c91026-c3e6-4432-8433-5bd51e9288a0","Type":"ContainerDied","Data":"8d84d79e605a28bfeadb386edd348e7ed889d8e043c3e47d4e5d80b97b5524eb"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.735002 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-26j2b" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.750596 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" event={"ID":"401b18b9-ead2-44fb-b61e-6ece6116f6f1","Type":"ContainerStarted","Data":"5385a0144b02003f44c302a543a37b14e3de6006b399fed7533e6240136767b1"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.750838 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.752692 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.752724 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6z85k" event={"ID":"a19cfcdb-e386-4dc3-a94c-095985b64a1d","Type":"ContainerDied","Data":"fdb7c65d88af31ffcb7ec8e7faa88dad6fc9cd8e24a7b6daf0139b738c3908db"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.755207 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2928825b-3e1c-48cd-827e-afad27fe84c1","Type":"ContainerStarted","Data":"217a35558dc9ab7ab3e4648b3ced818c0a612302eb78e42837c97e3020372e76"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.755337 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" podStartSLOduration=3.902025744 podStartE2EDuration="16.755321258s" podCreationTimestamp="2025-12-06 15:46:11 +0000 UTC" firstStartedPulling="2025-12-06 15:46:13.062685995 +0000 UTC m=+1040.360696908" lastFinishedPulling="2025-12-06 15:46:25.915981499 +0000 UTC m=+1053.213992422" observedRunningTime="2025-12-06 15:46:27.744527946 +0000 UTC m=+1055.042538859" watchObservedRunningTime="2025-12-06 15:46:27.755321258 +0000 UTC m=+1055.053332171" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.772477 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerStarted","Data":"2d8388c8c09ab32d2f3f332f8e780dac4da735bd62c80866b31cece9902797ce"} Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.798713 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" podStartSLOduration=3.517963698 podStartE2EDuration="16.798661113s" podCreationTimestamp="2025-12-06 15:46:11 +0000 UTC" firstStartedPulling="2025-12-06 15:46:12.608821507 +0000 UTC 
m=+1039.906832420" lastFinishedPulling="2025-12-06 15:46:25.889518932 +0000 UTC m=+1053.187529835" observedRunningTime="2025-12-06 15:46:27.783898232 +0000 UTC m=+1055.081909165" watchObservedRunningTime="2025-12-06 15:46:27.798661113 +0000 UTC m=+1055.096672026" Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.860363 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"] Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.870932 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6z85k"] Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.888677 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"] Dec 06 15:46:27 crc kubenswrapper[4848]: I1206 15:46:27.901620 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-26j2b"] Dec 06 15:46:28 crc kubenswrapper[4848]: I1206 15:46:28.977820 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c91026-c3e6-4432-8433-5bd51e9288a0" path="/var/lib/kubelet/pods/92c91026-c3e6-4432-8433-5bd51e9288a0/volumes" Dec 06 15:46:28 crc kubenswrapper[4848]: I1206 15:46:28.978267 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19cfcdb-e386-4dc3-a94c-095985b64a1d" path="/var/lib/kubelet/pods/a19cfcdb-e386-4dc3-a94c-095985b64a1d/volumes" Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.569891 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.626406 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"] Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.628376 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" 
containerName="dnsmasq-dns" containerID="cri-o://5385a0144b02003f44c302a543a37b14e3de6006b399fed7533e6240136767b1" gracePeriod=10 Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.633856 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.815404 4848 generic.go:334] "Generic (PLEG): container finished" podID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerID="5385a0144b02003f44c302a543a37b14e3de6006b399fed7533e6240136767b1" exitCode=0 Dec 06 15:46:32 crc kubenswrapper[4848]: I1206 15:46:32.815455 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" event={"ID":"401b18b9-ead2-44fb-b61e-6ece6116f6f1","Type":"ContainerDied","Data":"5385a0144b02003f44c302a543a37b14e3de6006b399fed7533e6240136767b1"} Dec 06 15:46:37 crc kubenswrapper[4848]: I1206 15:46:37.932186 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.005794 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjcv\" (UniqueName: \"kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv\") pod \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.005852 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc\") pod \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.005975 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config\") pod \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\" (UID: \"401b18b9-ead2-44fb-b61e-6ece6116f6f1\") " Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.014897 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv" (OuterVolumeSpecName: "kube-api-access-gvjcv") pod "401b18b9-ead2-44fb-b61e-6ece6116f6f1" (UID: "401b18b9-ead2-44fb-b61e-6ece6116f6f1"). InnerVolumeSpecName "kube-api-access-gvjcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.055681 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "401b18b9-ead2-44fb-b61e-6ece6116f6f1" (UID: "401b18b9-ead2-44fb-b61e-6ece6116f6f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.055718 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config" (OuterVolumeSpecName: "config") pod "401b18b9-ead2-44fb-b61e-6ece6116f6f1" (UID: "401b18b9-ead2-44fb-b61e-6ece6116f6f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.108263 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.108323 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjcv\" (UniqueName: \"kubernetes.io/projected/401b18b9-ead2-44fb-b61e-6ece6116f6f1-kube-api-access-gvjcv\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.108341 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/401b18b9-ead2-44fb-b61e-6ece6116f6f1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.855327 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d62dd990-abf6-47e3-aafd-5e7efb0ab5c6","Type":"ContainerStarted","Data":"3a2ad1eff33dbff4709b738f39b91b8e2e8af92ec98a686af20ead7491ed441d"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.855655 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.858978 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d723dc9-fd9d-4b78-9fa6-c18e8656f634","Type":"ContainerStarted","Data":"d3ed2cdb277d4308795efa3c6ccd636bff4d2db5851a99510f37a71a6eaccebf"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.861385 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb6169da-9db4-4d22-bd22-aaf2322103df","Type":"ContainerStarted","Data":"b94967d0ee70e3eeef186432061ba2f5093e764204852d41b0fe3b8c1665732a"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.866405 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e3135a6e-f317-4bd7-84dd-788f65fc87a0","Type":"ContainerStarted","Data":"4b46177a45d811e1bd3adc40ee2a136ca05f1ba1780b017404fe23ff2e72d937"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.866613 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.869733 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2928825b-3e1c-48cd-827e-afad27fe84c1","Type":"ContainerStarted","Data":"1a1bd1c684fd54767d3883b59b41806f416d423d692911dbc458bc55c7047c5b"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.879131 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.16305315 podStartE2EDuration="23.879112279s" podCreationTimestamp="2025-12-06 15:46:15 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.350284387 +0000 UTC m=+1053.648295300" lastFinishedPulling="2025-12-06 15:46:33.066343516 +0000 UTC m=+1060.364354429" observedRunningTime="2025-12-06 15:46:38.872922052 +0000 UTC m=+1066.170932965" watchObservedRunningTime="2025-12-06 15:46:38.879112279 +0000 UTC m=+1066.177123192" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.879187 4848 generic.go:334] "Generic (PLEG): container finished" podID="978d5b2d-7113-4b4e-944a-2681e5da434d" containerID="ac020e3c240b1bb9f3707929a395b72cc45c4ab343bccb2f5ab72d06a9cbf677" exitCode=0 Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.879266 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcx5h" event={"ID":"978d5b2d-7113-4b4e-944a-2681e5da434d","Type":"ContainerDied","Data":"ac020e3c240b1bb9f3707929a395b72cc45c4ab343bccb2f5ab72d06a9cbf677"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.882804 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/openstack-galera-0" event={"ID":"a1c66a34-c907-4841-92b1-0799522b6bd5","Type":"ContainerStarted","Data":"60b73bc0ccd0a606af5e09f3aee3fbd2c62659086492c9988957ffbb8d00bdd9"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.885311 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf" event={"ID":"93c0a1e4-91cd-4801-8439-a41fb872135f","Type":"ContainerStarted","Data":"abfc87ddb14fc6a6c9bea40af994ae8c20027671920b0d716e29cf1112e07450"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.885854 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-g6wzf" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.888904 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" event={"ID":"401b18b9-ead2-44fb-b61e-6ece6116f6f1","Type":"ContainerDied","Data":"75cc376258ccf94239e73ae28a2c5c1377131f38525f00b9e61002e3b9d6bf8f"} Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.888941 4848 scope.go:117] "RemoveContainer" containerID="5385a0144b02003f44c302a543a37b14e3de6006b399fed7533e6240136767b1" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.889066 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.919853 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.440717794 podStartE2EDuration="21.919831823s" podCreationTimestamp="2025-12-06 15:46:17 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.436215206 +0000 UTC m=+1053.734226119" lastFinishedPulling="2025-12-06 15:46:37.915329235 +0000 UTC m=+1065.213340148" observedRunningTime="2025-12-06 15:46:38.910677875 +0000 UTC m=+1066.208688788" watchObservedRunningTime="2025-12-06 15:46:38.919831823 +0000 UTC m=+1066.217842736" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.959542 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-g6wzf" podStartSLOduration=7.540615702 podStartE2EDuration="18.959522179s" podCreationTimestamp="2025-12-06 15:46:20 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.436579576 +0000 UTC m=+1053.734590489" lastFinishedPulling="2025-12-06 15:46:37.855486053 +0000 UTC m=+1065.153496966" observedRunningTime="2025-12-06 15:46:38.953449364 +0000 UTC m=+1066.251460287" watchObservedRunningTime="2025-12-06 15:46:38.959522179 +0000 UTC m=+1066.257533092" Dec 06 15:46:38 crc kubenswrapper[4848]: I1206 15:46:38.962315 4848 scope.go:117] "RemoveContainer" containerID="4cc2f4c57a2b2ddcf52a789daf50288bdf8aa2045f2a1d3fe21ad611d97ff62a" Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.012175 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"] Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.017657 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9msbr"] Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.897716 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcx5h" 
event={"ID":"978d5b2d-7113-4b4e-944a-2681e5da434d","Type":"ContainerStarted","Data":"b4abf82ea47d2a3224afd3e79726f42be6634103e6f9f008569394915f30d0be"} Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.898124 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcx5h" event={"ID":"978d5b2d-7113-4b4e-944a-2681e5da434d","Type":"ContainerStarted","Data":"2ef4d294153c2824943afab77bfe3045c970a06828526ea3ca7f2f72611ed97a"} Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.898872 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fcx5h" Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.899119 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fcx5h" Dec 06 15:46:39 crc kubenswrapper[4848]: I1206 15:46:39.923692 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fcx5h" podStartSLOduration=13.850370251 podStartE2EDuration="19.923673393s" podCreationTimestamp="2025-12-06 15:46:20 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.499281325 +0000 UTC m=+1053.797292238" lastFinishedPulling="2025-12-06 15:46:32.572584467 +0000 UTC m=+1059.870595380" observedRunningTime="2025-12-06 15:46:39.921181506 +0000 UTC m=+1067.219192419" watchObservedRunningTime="2025-12-06 15:46:39.923673393 +0000 UTC m=+1067.221684306" Dec 06 15:46:40 crc kubenswrapper[4848]: I1206 15:46:40.978053 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" path="/var/lib/kubelet/pods/401b18b9-ead2-44fb-b61e-6ece6116f6f1/volumes" Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.805493 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-9msbr" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.96:5353: i/o timeout" Dec 06 
15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.917947 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2928825b-3e1c-48cd-827e-afad27fe84c1","Type":"ContainerStarted","Data":"afc8635586b23460979372fc2a41618c4ba6e048b31a682ad097a273a3f7ec36"} Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.920343 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d723dc9-fd9d-4b78-9fa6-c18e8656f634","Type":"ContainerStarted","Data":"1e1feaa78b0fac6db13bfa77606a9b5530028c90bf26a058e27babd601b18362"} Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.922661 4848 generic.go:334] "Generic (PLEG): container finished" podID="bb6169da-9db4-4d22-bd22-aaf2322103df" containerID="b94967d0ee70e3eeef186432061ba2f5093e764204852d41b0fe3b8c1665732a" exitCode=0 Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.922730 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb6169da-9db4-4d22-bd22-aaf2322103df","Type":"ContainerDied","Data":"b94967d0ee70e3eeef186432061ba2f5093e764204852d41b0fe3b8c1665732a"} Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.924401 4848 generic.go:334] "Generic (PLEG): container finished" podID="a1c66a34-c907-4841-92b1-0799522b6bd5" containerID="60b73bc0ccd0a606af5e09f3aee3fbd2c62659086492c9988957ffbb8d00bdd9" exitCode=0 Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.924487 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1c66a34-c907-4841-92b1-0799522b6bd5","Type":"ContainerDied","Data":"60b73bc0ccd0a606af5e09f3aee3fbd2c62659086492c9988957ffbb8d00bdd9"} Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.966574 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.099989859 podStartE2EDuration="19.966551448s" podCreationTimestamp="2025-12-06 15:46:22 
+0000 UTC" firstStartedPulling="2025-12-06 15:46:27.061033576 +0000 UTC m=+1054.359044489" lastFinishedPulling="2025-12-06 15:46:40.927595165 +0000 UTC m=+1068.225606078" observedRunningTime="2025-12-06 15:46:41.943987235 +0000 UTC m=+1069.241998158" watchObservedRunningTime="2025-12-06 15:46:41.966551448 +0000 UTC m=+1069.264562361" Dec 06 15:46:41 crc kubenswrapper[4848]: I1206 15:46:41.990529 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.721875572 podStartE2EDuration="17.990502307s" podCreationTimestamp="2025-12-06 15:46:24 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.670785081 +0000 UTC m=+1053.968795994" lastFinishedPulling="2025-12-06 15:46:40.939411816 +0000 UTC m=+1068.237422729" observedRunningTime="2025-12-06 15:46:41.986827357 +0000 UTC m=+1069.284838270" watchObservedRunningTime="2025-12-06 15:46:41.990502307 +0000 UTC m=+1069.288513220" Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.177196 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.223983 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.935116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1c66a34-c907-4841-92b1-0799522b6bd5","Type":"ContainerStarted","Data":"c609bdc82ed08946d9812dccb33d483cf05b25e8f50bd4938481096af3551564"} Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.937979 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb6169da-9db4-4d22-bd22-aaf2322103df","Type":"ContainerStarted","Data":"03c8f6e5dba9efca3bb21c22b6e38925bdef7185999bbfb49e8cd31d52201b14"} Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.938140 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 15:46:42 crc kubenswrapper[4848]: I1206 15:46:42.958024 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.540321837 podStartE2EDuration="29.958003242s" podCreationTimestamp="2025-12-06 15:46:13 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.43785963 +0000 UTC m=+1053.735870543" lastFinishedPulling="2025-12-06 15:46:37.855541035 +0000 UTC m=+1065.153551948" observedRunningTime="2025-12-06 15:46:42.952344249 +0000 UTC m=+1070.250355162" watchObservedRunningTime="2025-12-06 15:46:42.958003242 +0000 UTC m=+1070.256014155" Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.603479 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.678264 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.708528 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.992578161 podStartE2EDuration="29.708508947s" podCreationTimestamp="2025-12-06 15:46:14 +0000 UTC" firstStartedPulling="2025-12-06 15:46:26.196242763 +0000 UTC m=+1053.494253676" lastFinishedPulling="2025-12-06 15:46:32.912173559 +0000 UTC m=+1060.210184462" observedRunningTime="2025-12-06 15:46:42.978713902 +0000 UTC m=+1070.276724835" watchObservedRunningTime="2025-12-06 15:46:43.708508947 +0000 UTC m=+1071.006519870" Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.944992 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.984102 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" 
Dec 06 15:46:43 crc kubenswrapper[4848]: I1206 15:46:43.988838 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.158679 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5h855"] Dec 06 15:46:44 crc kubenswrapper[4848]: E1206 15:46:44.159124 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="init" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.159144 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="init" Dec 06 15:46:44 crc kubenswrapper[4848]: E1206 15:46:44.159162 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="dnsmasq-dns" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.159171 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="dnsmasq-dns" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.159353 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="401b18b9-ead2-44fb-b61e-6ece6116f6f1" containerName="dnsmasq-dns" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.160336 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.162493 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.175940 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5h855"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.195324 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fl4pk"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.196514 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.198628 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.211382 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fl4pk"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.300595 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5h855"] Dec 06 15:46:44 crc kubenswrapper[4848]: E1206 15:46:44.301112 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-tjrsc ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" podUID="304b22ad-c499-4968-9ca6-5b077d8ff0ab" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302500 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovn-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " 
pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302540 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28348623-0697-417e-8f17-de443d77348c-config\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302559 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-combined-ca-bundle\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302603 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8v4s\" (UniqueName: \"kubernetes.io/projected/28348623-0697-417e-8f17-de443d77348c-kube-api-access-z8v4s\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302631 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302655 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovs-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: 
\"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302682 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302715 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302751 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.302909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrsc\" (UniqueName: \"kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.319560 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.320773 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.327894 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.337122 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.378418 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.379715 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.381677 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.381916 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.382069 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.382212 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-fbq4r" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.397067 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404646 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8tng\" (UniqueName: \"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: 
I1206 15:46:44.404718 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovs-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404793 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404820 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404844 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404880 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404919 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404953 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrsc\" (UniqueName: \"kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.404988 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405011 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovn-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405049 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405071 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28348623-0697-417e-8f17-de443d77348c-config\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405094 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-combined-ca-bundle\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405162 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8v4s\" (UniqueName: \"kubernetes.io/projected/28348623-0697-417e-8f17-de443d77348c-kube-api-access-z8v4s\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405641 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovs-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405719 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/28348623-0697-417e-8f17-de443d77348c-ovn-rundir\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.405864 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.406046 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.406256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.406838 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28348623-0697-417e-8f17-de443d77348c-config\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.414799 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-combined-ca-bundle\") pod \"ovn-controller-metrics-fl4pk\" (UID: 
\"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.420992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28348623-0697-417e-8f17-de443d77348c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.424788 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8v4s\" (UniqueName: \"kubernetes.io/projected/28348623-0697-417e-8f17-de443d77348c-kube-api-access-z8v4s\") pod \"ovn-controller-metrics-fl4pk\" (UID: \"28348623-0697-417e-8f17-de443d77348c\") " pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.427417 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjrsc\" (UniqueName: \"kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc\") pod \"dnsmasq-dns-5bf47b49b7-5h855\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.506949 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507015 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9bb\" (UniqueName: \"kubernetes.io/projected/04a22b86-df7e-4426-aa1c-3f8c21c02354-kube-api-access-9f9bb\") pod \"ovn-northd-0\" (UID: 
\"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507084 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-config\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507159 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507197 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8tng\" (UniqueName: \"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507225 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507251 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-scripts\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 
15:46:44.507286 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507376 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507402 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507453 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.507493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.508322 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.508512 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.508621 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.508637 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.513529 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fl4pk" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.522818 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8tng\" (UniqueName: \"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng\") pod \"dnsmasq-dns-8554648995-cj66w\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.598006 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.598128 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.608956 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-config\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.608999 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.609032 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.609056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-scripts\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.609092 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.609201 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.609229 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f9bb\" (UniqueName: \"kubernetes.io/projected/04a22b86-df7e-4426-aa1c-3f8c21c02354-kube-api-access-9f9bb\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.611870 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.612178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-config\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 
15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.612188 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a22b86-df7e-4426-aa1c-3f8c21c02354-scripts\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.616096 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.616267 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.616565 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a22b86-df7e-4426-aa1c-3f8c21c02354-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.645733 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.658117 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9bb\" (UniqueName: \"kubernetes.io/projected/04a22b86-df7e-4426-aa1c-3f8c21c02354-kube-api-access-9f9bb\") pod \"ovn-northd-0\" (UID: \"04a22b86-df7e-4426-aa1c-3f8c21c02354\") " pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.700306 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.955195 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.978828 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:44 crc kubenswrapper[4848]: I1206 15:46:44.982030 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fl4pk"] Dec 06 15:46:44 crc kubenswrapper[4848]: W1206 15:46:44.987505 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28348623_0697_417e_8f17_de443d77348c.slice/crio-d23f0e8a481e1e54dac738919b352239625d4c3d86cc52145d4789451e1ca2f2 WatchSource:0}: Error finding container d23f0e8a481e1e54dac738919b352239625d4c3d86cc52145d4789451e1ca2f2: Status 404 returned error can't find the container with id d23f0e8a481e1e54dac738919b352239625d4c3d86cc52145d4789451e1ca2f2 Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.014483 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc\") pod \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\" (UID: 
\"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.014645 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb\") pod \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.014730 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjrsc\" (UniqueName: \"kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc\") pod \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.014778 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config\") pod \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\" (UID: \"304b22ad-c499-4968-9ca6-5b077d8ff0ab\") " Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.014915 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "304b22ad-c499-4968-9ca6-5b077d8ff0ab" (UID: "304b22ad-c499-4968-9ca6-5b077d8ff0ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.015372 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config" (OuterVolumeSpecName: "config") pod "304b22ad-c499-4968-9ca6-5b077d8ff0ab" (UID: "304b22ad-c499-4968-9ca6-5b077d8ff0ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.015422 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "304b22ad-c499-4968-9ca6-5b077d8ff0ab" (UID: "304b22ad-c499-4968-9ca6-5b077d8ff0ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.016025 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.022989 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc" (OuterVolumeSpecName: "kube-api-access-tjrsc") pod "304b22ad-c499-4968-9ca6-5b077d8ff0ab" (UID: "304b22ad-c499-4968-9ca6-5b077d8ff0ab"). InnerVolumeSpecName "kube-api-access-tjrsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.083836 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.117362 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.117406 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjrsc\" (UniqueName: \"kubernetes.io/projected/304b22ad-c499-4968-9ca6-5b077d8ff0ab-kube-api-access-tjrsc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.117422 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304b22ad-c499-4968-9ca6-5b077d8ff0ab-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.200802 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.965906 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fl4pk" event={"ID":"28348623-0697-417e-8f17-de443d77348c","Type":"ContainerStarted","Data":"4b937293cf849c75edf15c727ecc3632ea492bfd32b5d63c820c487169ec8144"} Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.965996 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fl4pk" event={"ID":"28348623-0697-417e-8f17-de443d77348c","Type":"ContainerStarted","Data":"d23f0e8a481e1e54dac738919b352239625d4c3d86cc52145d4789451e1ca2f2"} Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.968879 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"04a22b86-df7e-4426-aa1c-3f8c21c02354","Type":"ContainerStarted","Data":"4019e71e6c7d33fc553528b75925b3c4bf7d91e7440eb84fb9d8cbf0433b1cbb"} Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.972242 4848 generic.go:334] "Generic (PLEG): container finished" podID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerID="e4f35468458de2019ddb3bcc154c2f36c862c5bbf5329385c0353d37ad84476f" exitCode=0 Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.972383 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5h855" Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.974606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-cj66w" event={"ID":"04b11548-5fac-4e9f-87ea-e9f0c8e4badc","Type":"ContainerDied","Data":"e4f35468458de2019ddb3bcc154c2f36c862c5bbf5329385c0353d37ad84476f"} Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.974652 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-cj66w" event={"ID":"04b11548-5fac-4e9f-87ea-e9f0c8e4badc","Type":"ContainerStarted","Data":"b48316f11b86eb0612bf24cdb2251ceba6b2e43043c904155de875f726779099"} Dec 06 15:46:45 crc kubenswrapper[4848]: I1206 15:46:45.992007 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fl4pk" podStartSLOduration=1.991981571 podStartE2EDuration="1.991981571s" podCreationTimestamp="2025-12-06 15:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:45.989133692 +0000 UTC m=+1073.287144605" watchObservedRunningTime="2025-12-06 15:46:45.991981571 +0000 UTC m=+1073.289992484" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.037841 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5h855"] Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 
15:46:46.052972 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5h855"] Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.189958 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.190289 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.289891 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.975643 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304b22ad-c499-4968-9ca6-5b077d8ff0ab" path="/var/lib/kubelet/pods/304b22ad-c499-4968-9ca6-5b077d8ff0ab/volumes" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.980646 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"04a22b86-df7e-4426-aa1c-3f8c21c02354","Type":"ContainerStarted","Data":"d9fe9f3e610ca1c0cbe5e326585cf2e2a1e58068eb3a86ec2efa05d49996df6b"} Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.980687 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"04a22b86-df7e-4426-aa1c-3f8c21c02354","Type":"ContainerStarted","Data":"df786414a4091d1b49a19dd5cc4dc8ddcb715ace62b171fd04c122afc527c951"} Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.980977 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 15:46:46 crc kubenswrapper[4848]: I1206 15:46:46.982891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-cj66w" event={"ID":"04b11548-5fac-4e9f-87ea-e9f0c8e4badc","Type":"ContainerStarted","Data":"b25d26cf886f62adf48bc34fa8580376c1aa5610863ca66ae0971336ffc392c8"} Dec 06 15:46:46 crc kubenswrapper[4848]: 
I1206 15:46:46.983671 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.007282 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.991431307 podStartE2EDuration="3.007262651s" podCreationTimestamp="2025-12-06 15:46:44 +0000 UTC" firstStartedPulling="2025-12-06 15:46:45.222523831 +0000 UTC m=+1072.520534744" lastFinishedPulling="2025-12-06 15:46:46.238355175 +0000 UTC m=+1073.536366088" observedRunningTime="2025-12-06 15:46:47.00504757 +0000 UTC m=+1074.303058483" watchObservedRunningTime="2025-12-06 15:46:47.007262651 +0000 UTC m=+1074.305273564" Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.021578 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-cj66w" podStartSLOduration=3.021552568 podStartE2EDuration="3.021552568s" podCreationTimestamp="2025-12-06 15:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:47.021107576 +0000 UTC m=+1074.319118489" watchObservedRunningTime="2025-12-06 15:46:47.021552568 +0000 UTC m=+1074.319563481" Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.152112 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.152171 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.656736 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 06 15:46:47 crc kubenswrapper[4848]: I1206 15:46:47.740267 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.093866 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.136085 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.138666 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.151553 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.162104 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.176741 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.176899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: 
\"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.176933 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhh9\" (UniqueName: \"kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.177004 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.177091 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.331023 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.331607 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: 
\"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.331673 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.331710 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhh9\" (UniqueName: \"kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.331803 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.332066 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.332646 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.332725 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.333428 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.365654 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhh9\" (UniqueName: \"kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9\") pod \"dnsmasq-dns-b8fbc5445-lrwq2\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.462469 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.869455 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:46:48 crc kubenswrapper[4848]: I1206 15:46:48.996261 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" event={"ID":"47e6554b-9e4d-4d28-bd63-b379825e5396","Type":"ContainerStarted","Data":"487f81c86f593a3f4c50f2bf1a01352c4a88f5a9e3c252ff02d4d5f8cd430fe0"} Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.372061 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.377073 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.379413 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.379532 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.380293 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jj67r" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.380758 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.390654 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.451810 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.516404 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-cell1-galera-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.548226 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-lock\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.548323 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x64w\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-kube-api-access-6x64w\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.548364 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.548396 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.548474 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-cache\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.649712 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-lock\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.649790 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x64w\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-kube-api-access-6x64w\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.649834 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.649859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.649901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-cache\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.650326 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-cache\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" 
Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.650528 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-lock\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.651040 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: E1206 15:46:49.651916 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 15:46:49 crc kubenswrapper[4848]: E1206 15:46:49.651949 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 15:46:49 crc kubenswrapper[4848]: E1206 15:46:49.652010 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift podName:14e2fe95-2aba-441c-85e1-ebd9bc0ba12f nodeName:}" failed. No retries permitted until 2025-12-06 15:46:50.151990813 +0000 UTC m=+1077.450001726 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift") pod "swift-storage-0" (UID: "14e2fe95-2aba-441c-85e1-ebd9bc0ba12f") : configmap "swift-ring-files" not found Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.670634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x64w\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-kube-api-access-6x64w\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:49 crc kubenswrapper[4848]: I1206 15:46:49.672663 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:50 crc kubenswrapper[4848]: I1206 15:46:50.004445 4848 generic.go:334] "Generic (PLEG): container finished" podID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerID="adc4dcf0df03ca9a4dac44c771792a69b112d099eaea4de31242817128024969" exitCode=0 Dec 06 15:46:50 crc kubenswrapper[4848]: I1206 15:46:50.004874 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" event={"ID":"47e6554b-9e4d-4d28-bd63-b379825e5396","Type":"ContainerDied","Data":"adc4dcf0df03ca9a4dac44c771792a69b112d099eaea4de31242817128024969"} Dec 06 15:46:50 crc kubenswrapper[4848]: I1206 15:46:50.005358 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-cj66w" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="dnsmasq-dns" containerID="cri-o://b25d26cf886f62adf48bc34fa8580376c1aa5610863ca66ae0971336ffc392c8" gracePeriod=10 Dec 06 15:46:50 crc kubenswrapper[4848]: I1206 15:46:50.160846 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:50 crc kubenswrapper[4848]: E1206 15:46:50.161101 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 15:46:50 crc kubenswrapper[4848]: E1206 15:46:50.161142 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 15:46:50 crc kubenswrapper[4848]: E1206 15:46:50.161217 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift podName:14e2fe95-2aba-441c-85e1-ebd9bc0ba12f nodeName:}" failed. No retries permitted until 2025-12-06 15:46:51.161187639 +0000 UTC m=+1078.459198602 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift") pod "swift-storage-0" (UID: "14e2fe95-2aba-441c-85e1-ebd9bc0ba12f") : configmap "swift-ring-files" not found Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.013844 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" event={"ID":"47e6554b-9e4d-4d28-bd63-b379825e5396","Type":"ContainerStarted","Data":"d620577155e2eb518cc205aba4690ce802be10bb6fa779aa9ccb985ea7767f34"} Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.014253 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.016025 4848 generic.go:334] "Generic (PLEG): container finished" podID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerID="b25d26cf886f62adf48bc34fa8580376c1aa5610863ca66ae0971336ffc392c8" exitCode=0 Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.016064 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-cj66w" event={"ID":"04b11548-5fac-4e9f-87ea-e9f0c8e4badc","Type":"ContainerDied","Data":"b25d26cf886f62adf48bc34fa8580376c1aa5610863ca66ae0971336ffc392c8"} Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.040118 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" podStartSLOduration=3.040101415 podStartE2EDuration="3.040101415s" podCreationTimestamp="2025-12-06 15:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:51.034354609 +0000 UTC m=+1078.332365522" watchObservedRunningTime="2025-12-06 15:46:51.040101415 +0000 UTC m=+1078.338112328" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.177974 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:51 crc kubenswrapper[4848]: E1206 15:46:51.178147 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 15:46:51 crc kubenswrapper[4848]: E1206 15:46:51.178164 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 15:46:51 crc kubenswrapper[4848]: E1206 15:46:51.178222 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift podName:14e2fe95-2aba-441c-85e1-ebd9bc0ba12f nodeName:}" failed. No retries permitted until 2025-12-06 15:46:53.178203247 +0000 UTC m=+1080.476214160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift") pod "swift-storage-0" (UID: "14e2fe95-2aba-441c-85e1-ebd9bc0ba12f") : configmap "swift-ring-files" not found Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.547242 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.686001 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb\") pod \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.686070 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config\") pod \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.686088 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb\") pod \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.686201 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc\") pod \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.686281 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8tng\" (UniqueName: \"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng\") pod \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\" (UID: \"04b11548-5fac-4e9f-87ea-e9f0c8e4badc\") " Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.691607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng" (OuterVolumeSpecName: "kube-api-access-z8tng") pod "04b11548-5fac-4e9f-87ea-e9f0c8e4badc" (UID: "04b11548-5fac-4e9f-87ea-e9f0c8e4badc"). InnerVolumeSpecName "kube-api-access-z8tng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.726027 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04b11548-5fac-4e9f-87ea-e9f0c8e4badc" (UID: "04b11548-5fac-4e9f-87ea-e9f0c8e4badc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.728160 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04b11548-5fac-4e9f-87ea-e9f0c8e4badc" (UID: "04b11548-5fac-4e9f-87ea-e9f0c8e4badc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.731178 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config" (OuterVolumeSpecName: "config") pod "04b11548-5fac-4e9f-87ea-e9f0c8e4badc" (UID: "04b11548-5fac-4e9f-87ea-e9f0c8e4badc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.733373 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04b11548-5fac-4e9f-87ea-e9f0c8e4badc" (UID: "04b11548-5fac-4e9f-87ea-e9f0c8e4badc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.788122 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.788158 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8tng\" (UniqueName: \"kubernetes.io/projected/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-kube-api-access-z8tng\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.788170 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.788180 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:51 crc kubenswrapper[4848]: I1206 15:46:51.788189 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b11548-5fac-4e9f-87ea-e9f0c8e4badc-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.024651 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-cj66w" Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.025015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-cj66w" event={"ID":"04b11548-5fac-4e9f-87ea-e9f0c8e4badc","Type":"ContainerDied","Data":"b48316f11b86eb0612bf24cdb2251ceba6b2e43043c904155de875f726779099"} Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.025053 4848 scope.go:117] "RemoveContainer" containerID="b25d26cf886f62adf48bc34fa8580376c1aa5610863ca66ae0971336ffc392c8" Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.044527 4848 scope.go:117] "RemoveContainer" containerID="e4f35468458de2019ddb3bcc154c2f36c862c5bbf5329385c0353d37ad84476f" Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.057644 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.063188 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-cj66w"] Dec 06 15:46:52 crc kubenswrapper[4848]: I1206 15:46:52.977647 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" path="/var/lib/kubelet/pods/04b11548-5fac-4e9f-87ea-e9f0c8e4badc/volumes" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.212529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:53 crc kubenswrapper[4848]: E1206 15:46:53.213118 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 15:46:53 crc kubenswrapper[4848]: E1206 15:46:53.213146 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 15:46:53 crc kubenswrapper[4848]: E1206 15:46:53.213229 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift podName:14e2fe95-2aba-441c-85e1-ebd9bc0ba12f nodeName:}" failed. No retries permitted until 2025-12-06 15:46:57.213204987 +0000 UTC m=+1084.511215930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift") pod "swift-storage-0" (UID: "14e2fe95-2aba-441c-85e1-ebd9bc0ba12f") : configmap "swift-ring-files" not found Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.229405 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-99ss6"] Dec 06 15:46:53 crc kubenswrapper[4848]: E1206 15:46:53.233201 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="init" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.233308 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="init" Dec 06 15:46:53 crc kubenswrapper[4848]: E1206 15:46:53.233394 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="dnsmasq-dns" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.233405 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="dnsmasq-dns" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.234802 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b11548-5fac-4e9f-87ea-e9f0c8e4badc" containerName="dnsmasq-dns" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.236893 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.240156 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.241004 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.241285 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.246388 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-99ss6"] Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.314858 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.314897 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.314917 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.314945 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.314972 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.315018 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.315171 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglzl\" (UniqueName: \"kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417045 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417128 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglzl\" (UniqueName: \"kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417234 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417260 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417283 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417310 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417348 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.417536 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.418264 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.418356 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.422556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.422844 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf\") pod 
\"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.423093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.434119 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglzl\" (UniqueName: \"kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl\") pod \"swift-ring-rebalance-99ss6\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.570653 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jj67r" Dec 06 15:46:53 crc kubenswrapper[4848]: I1206 15:46:53.579221 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:46:54 crc kubenswrapper[4848]: I1206 15:46:54.011028 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-99ss6"] Dec 06 15:46:54 crc kubenswrapper[4848]: I1206 15:46:54.038396 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-99ss6" event={"ID":"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d","Type":"ContainerStarted","Data":"1cc06108f73aa63f5e1a9ba1b1516156923ffddea90b9672394cd90ab8a96b1e"} Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.817741 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-82ca-account-create-update-4szsh"] Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.819825 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.821859 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.824531 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-82ca-account-create-update-4szsh"] Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.862567 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7488f"] Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.863798 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7488f" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.869383 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7488f"] Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.957452 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.957497 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmgx\" (UniqueName: \"kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.957533 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:55 crc kubenswrapper[4848]: I1206 15:46:55.957594 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjlp\" (UniqueName: \"kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.059441 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.059852 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjlp\" (UniqueName: \"kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.059966 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.059999 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmgx\" (UniqueName: \"kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.060417 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.060832 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.090276 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjlp\" (UniqueName: \"kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp\") pod \"keystone-82ca-account-create-update-4szsh\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.091978 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmgx\" (UniqueName: \"kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx\") pod \"keystone-db-create-7488f\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " pod="openstack/keystone-db-create-7488f" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.097450 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m2c2j"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.098833 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.108155 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m2c2j"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.137843 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.149749 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1ca1-account-create-update-s49tr"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.150977 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.153253 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.155734 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1ca1-account-create-update-s49tr"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.161228 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts\") pod \"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.161392 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm27k\" (UniqueName: \"kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k\") pod \"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.180730 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7488f" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.262780 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts\") pod \"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.263183 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.263213 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x795\" (UniqueName: \"kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.263241 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm27k\" (UniqueName: \"kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k\") pod \"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.263658 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts\") pod 
\"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.283583 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm27k\" (UniqueName: \"kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k\") pod \"placement-db-create-m2c2j\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.366056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.366123 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x795\" (UniqueName: \"kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.367060 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.386886 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x795\" (UniqueName: 
\"kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795\") pod \"placement-1ca1-account-create-update-s49tr\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.419139 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-s8cj2"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.420164 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.432265 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s8cj2"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.468230 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74hw\" (UniqueName: \"kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.468295 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.484146 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m2c2j" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.492265 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.519592 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1198-account-create-update-bbblr"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.520769 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.523169 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.544136 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1198-account-create-update-bbblr"] Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.569884 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74hw\" (UniqueName: \"kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.570140 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.570297 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 
15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.570393 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkmfx\" (UniqueName: \"kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.570824 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.595732 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74hw\" (UniqueName: \"kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw\") pod \"glance-db-create-s8cj2\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.671926 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.672014 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkmfx\" (UniqueName: \"kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " 
pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.673397 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.688969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkmfx\" (UniqueName: \"kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx\") pod \"glance-1198-account-create-update-bbblr\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.740948 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8cj2" Dec 06 15:46:56 crc kubenswrapper[4848]: I1206 15:46:56.837299 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.281823 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:46:57 crc kubenswrapper[4848]: E1206 15:46:57.282232 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 15:46:57 crc kubenswrapper[4848]: E1206 15:46:57.282247 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 15:46:57 crc kubenswrapper[4848]: E1206 15:46:57.282291 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift podName:14e2fe95-2aba-441c-85e1-ebd9bc0ba12f nodeName:}" failed. No retries permitted until 2025-12-06 15:47:05.282277483 +0000 UTC m=+1092.580288396 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift") pod "swift-storage-0" (UID: "14e2fe95-2aba-441c-85e1-ebd9bc0ba12f") : configmap "swift-ring-files" not found Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.707761 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s8cj2"] Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.718793 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7488f"] Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.725506 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1ca1-account-create-update-s49tr"] Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.845384 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m2c2j"] Dec 06 15:46:57 crc kubenswrapper[4848]: W1206 15:46:57.859003 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5534716d_e2f0_4e26_825c_8468b6832159.slice/crio-c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa WatchSource:0}: Error finding container c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa: Status 404 returned error can't find the container with id c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.916820 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1198-account-create-update-bbblr"] Dec 06 15:46:57 crc kubenswrapper[4848]: I1206 15:46:57.928737 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-82ca-account-create-update-4szsh"] Dec 06 15:46:57 crc kubenswrapper[4848]: W1206 15:46:57.932538 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3934c7ad_2ee0_4ada_883f_3fceb84c8195.slice/crio-217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91 WatchSource:0}: Error finding container 217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91: Status 404 returned error can't find the container with id 217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91 Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.075906 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m2c2j" event={"ID":"5534716d-e2f0-4e26-825c-8468b6832159","Type":"ContainerStarted","Data":"927a8c35447afa89cb0ac1a704737b3ed8d6c4a4dd0da516239a3a516f50fc6d"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.075948 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m2c2j" event={"ID":"5534716d-e2f0-4e26-825c-8468b6832159","Type":"ContainerStarted","Data":"c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.077161 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1ca1-account-create-update-s49tr" event={"ID":"fa122601-0140-4956-9ff0-2ad0e05ea423","Type":"ContainerStarted","Data":"67dec1c86133e33f09aacad77b09e2e65dbd99afe333541242906ec4bad2ee44"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.077185 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1ca1-account-create-update-s49tr" event={"ID":"fa122601-0140-4956-9ff0-2ad0e05ea423","Type":"ContainerStarted","Data":"9ca2bfd66a63630a6384e1bbf6db10030f3d6aeb75429c365d3ca4950e8f7d42"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.078286 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1198-account-create-update-bbblr" 
event={"ID":"3934c7ad-2ee0-4ada-883f-3fceb84c8195","Type":"ContainerStarted","Data":"217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.078976 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-82ca-account-create-update-4szsh" event={"ID":"7e2bccf8-b293-40e5-a41b-78ec5826fc22","Type":"ContainerStarted","Data":"9b2b060c03f0dc2a3eacc633985f2dd951f2fe0e5fa7a3396ac70ae5dd6fb37d"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.080095 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-99ss6" event={"ID":"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d","Type":"ContainerStarted","Data":"6d9cb4ec92d8d5fbdac9208298cb3991872399d8c5015c54e21da4d29217929f"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.081620 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8cj2" event={"ID":"e395dab1-bcea-4e77-8de4-cbd303e30216","Type":"ContainerStarted","Data":"54ae842e030a67d597612828948dd8f3ca668a5a9df7064fa50b634643ddf678"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.081652 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8cj2" event={"ID":"e395dab1-bcea-4e77-8de4-cbd303e30216","Type":"ContainerStarted","Data":"42348f63d6981d7efe35590a344a1e4f7489b2f8f750b6b9c2397ca6c16b50a8"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.082902 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7488f" event={"ID":"f9c835a3-f7fd-439d-8839-f2ebc7923899","Type":"ContainerStarted","Data":"c7d3357e58d3f88dc9c3baf4c09ecd630ddf3325b0938d3885a84bbb7377b05c"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.082950 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7488f" 
event={"ID":"f9c835a3-f7fd-439d-8839-f2ebc7923899","Type":"ContainerStarted","Data":"c7cdc79a061bb526c24a31ae8995d2b76e9676fa9c880c841c2c34e071d4d8b1"} Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.129621 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m2c2j" podStartSLOduration=2.129600322 podStartE2EDuration="2.129600322s" podCreationTimestamp="2025-12-06 15:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:58.092748534 +0000 UTC m=+1085.390759457" watchObservedRunningTime="2025-12-06 15:46:58.129600322 +0000 UTC m=+1085.427611235" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.144109 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-1ca1-account-create-update-s49tr" podStartSLOduration=2.144086555 podStartE2EDuration="2.144086555s" podCreationTimestamp="2025-12-06 15:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:58.115369477 +0000 UTC m=+1085.413380390" watchObservedRunningTime="2025-12-06 15:46:58.144086555 +0000 UTC m=+1085.442097478" Dec 06 15:46:58 crc kubenswrapper[4848]: E1206 15:46:58.162103 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c835a3_f7fd_439d_8839_f2ebc7923899.slice/crio-c7d3357e58d3f88dc9c3baf4c09ecd630ddf3325b0938d3885a84bbb7377b05c.scope\": RecentStats: unable to find data in memory cache]" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.167412 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-99ss6" podStartSLOduration=1.970640097 podStartE2EDuration="5.166654827s" podCreationTimestamp="2025-12-06 
15:46:53 +0000 UTC" firstStartedPulling="2025-12-06 15:46:54.014636783 +0000 UTC m=+1081.312647696" lastFinishedPulling="2025-12-06 15:46:57.210651513 +0000 UTC m=+1084.508662426" observedRunningTime="2025-12-06 15:46:58.140906078 +0000 UTC m=+1085.438917001" watchObservedRunningTime="2025-12-06 15:46:58.166654827 +0000 UTC m=+1085.464665740" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.176591 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7488f" podStartSLOduration=3.176556905 podStartE2EDuration="3.176556905s" podCreationTimestamp="2025-12-06 15:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:58.158411963 +0000 UTC m=+1085.456422886" watchObservedRunningTime="2025-12-06 15:46:58.176556905 +0000 UTC m=+1085.474567818" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.186056 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-s8cj2" podStartSLOduration=2.186031152 podStartE2EDuration="2.186031152s" podCreationTimestamp="2025-12-06 15:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:46:58.168806304 +0000 UTC m=+1085.466817207" watchObservedRunningTime="2025-12-06 15:46:58.186031152 +0000 UTC m=+1085.484042065" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.463903 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.548157 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"] Dec 06 15:46:58 crc kubenswrapper[4848]: I1206 15:46:58.548381 4848 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="dnsmasq-dns" containerID="cri-o://de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808" gracePeriod=10 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.067646 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.095144 4848 generic.go:334] "Generic (PLEG): container finished" podID="f9c835a3-f7fd-439d-8839-f2ebc7923899" containerID="c7d3357e58d3f88dc9c3baf4c09ecd630ddf3325b0938d3885a84bbb7377b05c" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.095213 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7488f" event={"ID":"f9c835a3-f7fd-439d-8839-f2ebc7923899","Type":"ContainerDied","Data":"c7d3357e58d3f88dc9c3baf4c09ecd630ddf3325b0938d3885a84bbb7377b05c"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.096625 4848 generic.go:334] "Generic (PLEG): container finished" podID="5534716d-e2f0-4e26-825c-8468b6832159" containerID="927a8c35447afa89cb0ac1a704737b3ed8d6c4a4dd0da516239a3a516f50fc6d" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.096658 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m2c2j" event={"ID":"5534716d-e2f0-4e26-825c-8468b6832159","Type":"ContainerDied","Data":"927a8c35447afa89cb0ac1a704737b3ed8d6c4a4dd0da516239a3a516f50fc6d"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.097882 4848 generic.go:334] "Generic (PLEG): container finished" podID="7e2bccf8-b293-40e5-a41b-78ec5826fc22" containerID="3dfd060ec291ddb4e4c193a2c4938ce1b8bcb920db2bbfd4871c7cf5244c621d" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.097923 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-82ca-account-create-update-4szsh" 
event={"ID":"7e2bccf8-b293-40e5-a41b-78ec5826fc22","Type":"ContainerDied","Data":"3dfd060ec291ddb4e4c193a2c4938ce1b8bcb920db2bbfd4871c7cf5244c621d"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.099038 4848 generic.go:334] "Generic (PLEG): container finished" podID="fa122601-0140-4956-9ff0-2ad0e05ea423" containerID="67dec1c86133e33f09aacad77b09e2e65dbd99afe333541242906ec4bad2ee44" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.099074 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1ca1-account-create-update-s49tr" event={"ID":"fa122601-0140-4956-9ff0-2ad0e05ea423","Type":"ContainerDied","Data":"67dec1c86133e33f09aacad77b09e2e65dbd99afe333541242906ec4bad2ee44"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.100415 4848 generic.go:334] "Generic (PLEG): container finished" podID="3934c7ad-2ee0-4ada-883f-3fceb84c8195" containerID="24779547ebbefe88f1fc20baa7693447c72cb98f08c1c558b418cf1b3a3dbe0f" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.100454 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1198-account-create-update-bbblr" event={"ID":"3934c7ad-2ee0-4ada-883f-3fceb84c8195","Type":"ContainerDied","Data":"24779547ebbefe88f1fc20baa7693447c72cb98f08c1c558b418cf1b3a3dbe0f"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.102296 4848 generic.go:334] "Generic (PLEG): container finished" podID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerID="de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.102336 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" event={"ID":"a8715bfa-e1a1-4467-8543-6807e7facc8e","Type":"ContainerDied","Data":"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.102354 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" event={"ID":"a8715bfa-e1a1-4467-8543-6807e7facc8e","Type":"ContainerDied","Data":"a3f6109fe48f288af7a0f3edcdf294c9c799dfd783eea73bc5c476a49be590a1"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.102370 4848 scope.go:117] "RemoveContainer" containerID="de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.102467 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l66z5" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.104636 4848 generic.go:334] "Generic (PLEG): container finished" podID="e395dab1-bcea-4e77-8de4-cbd303e30216" containerID="54ae842e030a67d597612828948dd8f3ca668a5a9df7064fa50b634643ddf678" exitCode=0 Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.104817 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8cj2" event={"ID":"e395dab1-bcea-4e77-8de4-cbd303e30216","Type":"ContainerDied","Data":"54ae842e030a67d597612828948dd8f3ca668a5a9df7064fa50b634643ddf678"} Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.120555 4848 scope.go:117] "RemoveContainer" containerID="fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.122008 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc\") pod \"a8715bfa-e1a1-4467-8543-6807e7facc8e\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.122083 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config\") pod \"a8715bfa-e1a1-4467-8543-6807e7facc8e\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " Dec 06 15:46:59 
crc kubenswrapper[4848]: I1206 15:46:59.122104 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq\") pod \"a8715bfa-e1a1-4467-8543-6807e7facc8e\" (UID: \"a8715bfa-e1a1-4467-8543-6807e7facc8e\") " Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.127805 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq" (OuterVolumeSpecName: "kube-api-access-nrzpq") pod "a8715bfa-e1a1-4467-8543-6807e7facc8e" (UID: "a8715bfa-e1a1-4467-8543-6807e7facc8e"). InnerVolumeSpecName "kube-api-access-nrzpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.167831 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8715bfa-e1a1-4467-8543-6807e7facc8e" (UID: "a8715bfa-e1a1-4467-8543-6807e7facc8e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.179252 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config" (OuterVolumeSpecName: "config") pod "a8715bfa-e1a1-4467-8543-6807e7facc8e" (UID: "a8715bfa-e1a1-4467-8543-6807e7facc8e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.224176 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.224209 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/a8715bfa-e1a1-4467-8543-6807e7facc8e-kube-api-access-nrzpq\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.224221 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8715bfa-e1a1-4467-8543-6807e7facc8e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.250089 4848 scope.go:117] "RemoveContainer" containerID="de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808" Dec 06 15:46:59 crc kubenswrapper[4848]: E1206 15:46:59.250558 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808\": container with ID starting with de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808 not found: ID does not exist" containerID="de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.250591 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808"} err="failed to get container status \"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808\": rpc error: code = NotFound desc = could not find container \"de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808\": container with ID starting with 
de987f1334f004f83718b7fa9d7513eedd35809775e27125a4b18347439a9808 not found: ID does not exist" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.250616 4848 scope.go:117] "RemoveContainer" containerID="fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017" Dec 06 15:46:59 crc kubenswrapper[4848]: E1206 15:46:59.250983 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017\": container with ID starting with fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017 not found: ID does not exist" containerID="fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.251010 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017"} err="failed to get container status \"fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017\": rpc error: code = NotFound desc = could not find container \"fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017\": container with ID starting with fa6230deb6478b4de3cb2ee468325bb402cee65204f79bed702f2bb65a86e017 not found: ID does not exist" Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.467538 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"] Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.474465 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l66z5"] Dec 06 15:46:59 crc kubenswrapper[4848]: I1206 15:46:59.755942 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.115521 4848 generic.go:334] "Generic (PLEG): container finished" podID="dda76265-1c2c-4409-8460-99bc3ab509c6" 
containerID="2d8388c8c09ab32d2f3f332f8e780dac4da735bd62c80866b31cece9902797ce" exitCode=0 Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.115587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerDied","Data":"2d8388c8c09ab32d2f3f332f8e780dac4da735bd62c80866b31cece9902797ce"} Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.121595 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerID="ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade" exitCode=0 Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.121749 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerDied","Data":"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade"} Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.501513 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-s8cj2" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.552926 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts\") pod \"e395dab1-bcea-4e77-8de4-cbd303e30216\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.553084 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74hw\" (UniqueName: \"kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw\") pod \"e395dab1-bcea-4e77-8de4-cbd303e30216\" (UID: \"e395dab1-bcea-4e77-8de4-cbd303e30216\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.554971 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e395dab1-bcea-4e77-8de4-cbd303e30216" (UID: "e395dab1-bcea-4e77-8de4-cbd303e30216"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.558588 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw" (OuterVolumeSpecName: "kube-api-access-r74hw") pod "e395dab1-bcea-4e77-8de4-cbd303e30216" (UID: "e395dab1-bcea-4e77-8de4-cbd303e30216"). InnerVolumeSpecName "kube-api-access-r74hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.637363 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7488f" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.654233 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m2c2j" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.654406 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmgx\" (UniqueName: \"kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx\") pod \"f9c835a3-f7fd-439d-8839-f2ebc7923899\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.654628 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts\") pod \"f9c835a3-f7fd-439d-8839-f2ebc7923899\" (UID: \"f9c835a3-f7fd-439d-8839-f2ebc7923899\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.654994 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e395dab1-bcea-4e77-8de4-cbd303e30216-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.655011 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74hw\" (UniqueName: \"kubernetes.io/projected/e395dab1-bcea-4e77-8de4-cbd303e30216-kube-api-access-r74hw\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.655239 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9c835a3-f7fd-439d-8839-f2ebc7923899" (UID: "f9c835a3-f7fd-439d-8839-f2ebc7923899"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.658974 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx" (OuterVolumeSpecName: "kube-api-access-qwmgx") pod "f9c835a3-f7fd-439d-8839-f2ebc7923899" (UID: "f9c835a3-f7fd-439d-8839-f2ebc7923899"). InnerVolumeSpecName "kube-api-access-qwmgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.667051 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.678277 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.680881 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756235 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts\") pod \"5534716d-e2f0-4e26-825c-8468b6832159\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756291 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm27k\" (UniqueName: \"kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k\") pod \"5534716d-e2f0-4e26-825c-8468b6832159\" (UID: \"5534716d-e2f0-4e26-825c-8468b6832159\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756313 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts\") pod \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756330 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkmfx\" (UniqueName: \"kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx\") pod \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756356 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts\") pod \"fa122601-0140-4956-9ff0-2ad0e05ea423\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756400 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts\") pod \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\" (UID: \"3934c7ad-2ee0-4ada-883f-3fceb84c8195\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756445 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x795\" (UniqueName: \"kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795\") pod \"fa122601-0140-4956-9ff0-2ad0e05ea423\" (UID: \"fa122601-0140-4956-9ff0-2ad0e05ea423\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756465 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsjlp\" (UniqueName: \"kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp\") pod \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\" (UID: \"7e2bccf8-b293-40e5-a41b-78ec5826fc22\") " Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756687 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e2bccf8-b293-40e5-a41b-78ec5826fc22" (UID: "7e2bccf8-b293-40e5-a41b-78ec5826fc22"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756854 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwmgx\" (UniqueName: \"kubernetes.io/projected/f9c835a3-f7fd-439d-8839-f2ebc7923899-kube-api-access-qwmgx\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756871 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2bccf8-b293-40e5-a41b-78ec5826fc22-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.756882 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9c835a3-f7fd-439d-8839-f2ebc7923899-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.757248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa122601-0140-4956-9ff0-2ad0e05ea423" (UID: "fa122601-0140-4956-9ff0-2ad0e05ea423"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.757520 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3934c7ad-2ee0-4ada-883f-3fceb84c8195" (UID: "3934c7ad-2ee0-4ada-883f-3fceb84c8195"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.757879 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5534716d-e2f0-4e26-825c-8468b6832159" (UID: "5534716d-e2f0-4e26-825c-8468b6832159"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.759992 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx" (OuterVolumeSpecName: "kube-api-access-pkmfx") pod "3934c7ad-2ee0-4ada-883f-3fceb84c8195" (UID: "3934c7ad-2ee0-4ada-883f-3fceb84c8195"). InnerVolumeSpecName "kube-api-access-pkmfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.763733 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k" (OuterVolumeSpecName: "kube-api-access-xm27k") pod "5534716d-e2f0-4e26-825c-8468b6832159" (UID: "5534716d-e2f0-4e26-825c-8468b6832159"). InnerVolumeSpecName "kube-api-access-xm27k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.765256 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp" (OuterVolumeSpecName: "kube-api-access-tsjlp") pod "7e2bccf8-b293-40e5-a41b-78ec5826fc22" (UID: "7e2bccf8-b293-40e5-a41b-78ec5826fc22"). InnerVolumeSpecName "kube-api-access-tsjlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.765874 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795" (OuterVolumeSpecName: "kube-api-access-2x795") pod "fa122601-0140-4956-9ff0-2ad0e05ea423" (UID: "fa122601-0140-4956-9ff0-2ad0e05ea423"). InnerVolumeSpecName "kube-api-access-2x795". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859043 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3934c7ad-2ee0-4ada-883f-3fceb84c8195-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859395 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x795\" (UniqueName: \"kubernetes.io/projected/fa122601-0140-4956-9ff0-2ad0e05ea423-kube-api-access-2x795\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859412 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsjlp\" (UniqueName: \"kubernetes.io/projected/7e2bccf8-b293-40e5-a41b-78ec5826fc22-kube-api-access-tsjlp\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859425 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5534716d-e2f0-4e26-825c-8468b6832159-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859437 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm27k\" (UniqueName: \"kubernetes.io/projected/5534716d-e2f0-4e26-825c-8468b6832159-kube-api-access-xm27k\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859450 4848 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkmfx\" (UniqueName: \"kubernetes.io/projected/3934c7ad-2ee0-4ada-883f-3fceb84c8195-kube-api-access-pkmfx\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.859460 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa122601-0140-4956-9ff0-2ad0e05ea423-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:00 crc kubenswrapper[4848]: I1206 15:47:00.975529 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" path="/var/lib/kubelet/pods/a8715bfa-e1a1-4467-8543-6807e7facc8e/volumes" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.129344 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-82ca-account-create-update-4szsh" event={"ID":"7e2bccf8-b293-40e5-a41b-78ec5826fc22","Type":"ContainerDied","Data":"9b2b060c03f0dc2a3eacc633985f2dd951f2fe0e5fa7a3396ac70ae5dd6fb37d"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.129386 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2b060c03f0dc2a3eacc633985f2dd951f2fe0e5fa7a3396ac70ae5dd6fb37d" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.129447 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-82ca-account-create-update-4szsh" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.133003 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerStarted","Data":"8413d8e7b39ea3e60a00900db0021fc2f01a010ecbc78475875e6f6ff9166990"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.133427 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.134907 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerStarted","Data":"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.135117 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.136134 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8cj2" event={"ID":"e395dab1-bcea-4e77-8de4-cbd303e30216","Type":"ContainerDied","Data":"42348f63d6981d7efe35590a344a1e4f7489b2f8f750b6b9c2397ca6c16b50a8"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.136156 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42348f63d6981d7efe35590a344a1e4f7489b2f8f750b6b9c2397ca6c16b50a8" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.136182 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8cj2" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.137410 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7488f" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.137419 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7488f" event={"ID":"f9c835a3-f7fd-439d-8839-f2ebc7923899","Type":"ContainerDied","Data":"c7cdc79a061bb526c24a31ae8995d2b76e9676fa9c880c841c2c34e071d4d8b1"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.137454 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7cdc79a061bb526c24a31ae8995d2b76e9676fa9c880c841c2c34e071d4d8b1" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.138760 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m2c2j" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.138763 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m2c2j" event={"ID":"5534716d-e2f0-4e26-825c-8468b6832159","Type":"ContainerDied","Data":"c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.138870 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c677b926bfad19b5fc9ecbd75b0e55deef90a43f6ea57c6677f09c282d5976aa" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.140042 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1ca1-account-create-update-s49tr" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.140042 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1ca1-account-create-update-s49tr" event={"ID":"fa122601-0140-4956-9ff0-2ad0e05ea423","Type":"ContainerDied","Data":"9ca2bfd66a63630a6384e1bbf6db10030f3d6aeb75429c365d3ca4950e8f7d42"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.140153 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca2bfd66a63630a6384e1bbf6db10030f3d6aeb75429c365d3ca4950e8f7d42" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.141486 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1198-account-create-update-bbblr" event={"ID":"3934c7ad-2ee0-4ada-883f-3fceb84c8195","Type":"ContainerDied","Data":"217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91"} Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.141510 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217d5cd54f96158479b8e6680402c0197d922187e9ea65198a12017006eaff91" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.141531 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1198-account-create-update-bbblr" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.171133 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.815516608 podStartE2EDuration="50.171112906s" podCreationTimestamp="2025-12-06 15:46:11 +0000 UTC" firstStartedPulling="2025-12-06 15:46:13.481090292 +0000 UTC m=+1040.779101205" lastFinishedPulling="2025-12-06 15:46:25.83668659 +0000 UTC m=+1053.134697503" observedRunningTime="2025-12-06 15:47:01.169077041 +0000 UTC m=+1088.467087954" watchObservedRunningTime="2025-12-06 15:47:01.171112906 +0000 UTC m=+1088.469123819" Dec 06 15:47:01 crc kubenswrapper[4848]: I1206 15:47:01.199099 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.199082324 podStartE2EDuration="50.199082324s" podCreationTimestamp="2025-12-06 15:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:01.192593768 +0000 UTC m=+1088.490604681" watchObservedRunningTime="2025-12-06 15:47:01.199082324 +0000 UTC m=+1088.497093247" Dec 06 15:47:04 crc kubenswrapper[4848]: I1206 15:47:04.165014 4848 generic.go:334] "Generic (PLEG): container finished" podID="6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" containerID="6d9cb4ec92d8d5fbdac9208298cb3991872399d8c5015c54e21da4d29217929f" exitCode=0 Dec 06 15:47:04 crc kubenswrapper[4848]: I1206 15:47:04.165111 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-99ss6" event={"ID":"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d","Type":"ContainerDied","Data":"6d9cb4ec92d8d5fbdac9208298cb3991872399d8c5015c54e21da4d29217929f"} Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.326833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.334832 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14e2fe95-2aba-441c-85e1-ebd9bc0ba12f-etc-swift\") pod \"swift-storage-0\" (UID: \"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f\") " pod="openstack/swift-storage-0" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.468123 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.609082 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.630755 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.630879 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.630928 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 
15:47:05.630960 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.631001 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.631035 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglzl\" (UniqueName: \"kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.631103 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts\") pod \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\" (UID: \"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d\") " Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.632181 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.632643 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.638613 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.655947 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl" (OuterVolumeSpecName: "kube-api-access-pglzl") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "kube-api-access-pglzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.665636 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts" (OuterVolumeSpecName: "scripts") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.680530 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.682427 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" (UID: "6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733295 4848 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733333 4848 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733344 4848 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733355 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc 
kubenswrapper[4848]: I1206 15:47:05.733369 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglzl\" (UniqueName: \"kubernetes.io/projected/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-kube-api-access-pglzl\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733384 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:05 crc kubenswrapper[4848]: I1206 15:47:05.733394 4848 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.146464 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 15:47:06 crc kubenswrapper[4848]: W1206 15:47:06.146767 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e2fe95_2aba_441c_85e1_ebd9bc0ba12f.slice/crio-dc7d09ed4cd23e4705c36ccc91e7be2286a133b1dff0293d88113b07181398d0 WatchSource:0}: Error finding container dc7d09ed4cd23e4705c36ccc91e7be2286a133b1dff0293d88113b07181398d0: Status 404 returned error can't find the container with id dc7d09ed4cd23e4705c36ccc91e7be2286a133b1dff0293d88113b07181398d0 Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.180216 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-99ss6" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.180206 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-99ss6" event={"ID":"6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d","Type":"ContainerDied","Data":"1cc06108f73aa63f5e1a9ba1b1516156923ffddea90b9672394cd90ab8a96b1e"} Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.180277 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc06108f73aa63f5e1a9ba1b1516156923ffddea90b9672394cd90ab8a96b1e" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.182306 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"dc7d09ed4cd23e4705c36ccc91e7be2286a133b1dff0293d88113b07181398d0"} Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.688804 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l4vmt"] Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689384 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="dnsmasq-dns" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689397 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="dnsmasq-dns" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689416 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="init" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689421 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="init" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689431 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534716d-e2f0-4e26-825c-8468b6832159" 
containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689437 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534716d-e2f0-4e26-825c-8468b6832159" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689448 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e395dab1-bcea-4e77-8de4-cbd303e30216" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689453 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e395dab1-bcea-4e77-8de4-cbd303e30216" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689461 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" containerName="swift-ring-rebalance" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689466 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" containerName="swift-ring-rebalance" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689479 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c835a3-f7fd-439d-8839-f2ebc7923899" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689486 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c835a3-f7fd-439d-8839-f2ebc7923899" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689497 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bccf8-b293-40e5-a41b-78ec5826fc22" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689503 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2bccf8-b293-40e5-a41b-78ec5826fc22" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689516 4848 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fa122601-0140-4956-9ff0-2ad0e05ea423" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689522 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa122601-0140-4956-9ff0-2ad0e05ea423" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: E1206 15:47:06.689531 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3934c7ad-2ee0-4ada-883f-3fceb84c8195" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689537 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3934c7ad-2ee0-4ada-883f-3fceb84c8195" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689673 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa122601-0140-4956-9ff0-2ad0e05ea423" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689710 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5534716d-e2f0-4e26-825c-8468b6832159" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689727 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8715bfa-e1a1-4467-8543-6807e7facc8e" containerName="dnsmasq-dns" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689733 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3934c7ad-2ee0-4ada-883f-3fceb84c8195" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689742 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d" containerName="swift-ring-rebalance" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689748 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c835a3-f7fd-439d-8839-f2ebc7923899" 
containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689754 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e395dab1-bcea-4e77-8de4-cbd303e30216" containerName="mariadb-database-create" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.689760 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2bccf8-b293-40e5-a41b-78ec5826fc22" containerName="mariadb-account-create-update" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.690248 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.693548 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.693847 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-45bqd" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.700721 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l4vmt"] Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.850473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.850584 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.850618 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7nz9\" (UniqueName: \"kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.850888 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.952324 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.953332 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.953361 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7nz9\" (UniqueName: \"kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.953567 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.957909 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.957949 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.965806 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:06 crc kubenswrapper[4848]: I1206 15:47:06.977004 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7nz9\" (UniqueName: \"kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9\") pod \"glance-db-sync-l4vmt\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:07 crc kubenswrapper[4848]: I1206 15:47:07.006431 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:07 crc kubenswrapper[4848]: I1206 15:47:07.544320 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l4vmt"] Dec 06 15:47:08 crc kubenswrapper[4848]: I1206 15:47:08.197066 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l4vmt" event={"ID":"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930","Type":"ContainerStarted","Data":"3641f4e1d1c4641c84197390a7283864a3fb38df1c17091d0c7a2a3ae02f0501"} Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.217977 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"7e72229e3dbfa7dd2ed156fb5dca8ea3e63d441140e02043b68deb149e36ec1b"} Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.218837 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"cbd6cd831000337979ed911294974b9a06f79b7962235bb31dfe2c63f3b43032"} Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.218858 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"be3b477b076dd28b9b77fbc7aeb39dd12ae306cd1031ed51307a6e515c1a6de9"} Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.218875 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"dee7e189b1e6ab73a5759199e000ea91f7b2ad1bde611483bada87e31fbcb81e"} Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.740479 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-g6wzf" podUID="93c0a1e4-91cd-4801-8439-a41fb872135f" containerName="ovn-controller" probeResult="failure" output=< 
Dec 06 15:47:10 crc kubenswrapper[4848]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 15:47:10 crc kubenswrapper[4848]: > Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.741534 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fcx5h" Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.747646 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fcx5h" Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.992510 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g6wzf-config-jtnw9"] Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.993801 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g6wzf-config-jtnw9"] Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.993956 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:10 crc kubenswrapper[4848]: I1206 15:47:10.996186 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.124893 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.125635 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggq9\" (UniqueName: \"kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: 
\"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.125674 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.125714 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.125750 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.125767 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.226751 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggq9\" (UniqueName: 
\"kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.226899 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.226938 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.226985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.227004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.227140 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.227612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.227931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.228029 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.228738 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.229899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts\") 
pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.244953 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggq9\" (UniqueName: \"kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9\") pod \"ovn-controller-g6wzf-config-jtnw9\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:11 crc kubenswrapper[4848]: I1206 15:47:11.320430 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:12 crc kubenswrapper[4848]: I1206 15:47:12.655204 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g6wzf-config-jtnw9"] Dec 06 15:47:12 crc kubenswrapper[4848]: I1206 15:47:12.954877 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.244671 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"2fb54a81b44a555becf664af6d35590f080a5d3a61d0809221c2646bdb7f484f"} Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.244951 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"6c7d940d96877a8eba3fba39917d653cfd54537a53c4f0279c55bb5436fc39bb"} Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.245925 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf-config-jtnw9" event={"ID":"5314397f-9320-40b0-b18d-e9697cd268c8","Type":"ContainerStarted","Data":"f0d91ca37850697afc40ec8d3a6c06f0881bc4261990f92d9e123997e6bb6295"} 
Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.700681 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-89hsl"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.702244 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.712831 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-89hsl"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.778637 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.778740 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzll\" (UniqueName: \"kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.799803 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6wzhg"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.801442 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.824888 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8861-account-create-update-9w8cq"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.826058 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.832540 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.844024 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6wzhg"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.852529 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8861-account-create-update-9w8cq"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.880972 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmc2w\" (UniqueName: \"kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.881057 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.881132 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.881172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzll\" (UniqueName: 
\"kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.882002 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.929352 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzll\" (UniqueName: \"kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll\") pod \"cinder-db-create-89hsl\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.931127 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3c32-account-create-update-tkb9x"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.932974 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.937016 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.949598 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c32-account-create-update-tkb9x"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.960166 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.985275 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdvdn\" (UniqueName: \"kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.985352 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmc2w\" (UniqueName: \"kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.985391 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.985421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.986166 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.987107 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p6tct"] Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.988066 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.992209 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.992369 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.992632 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s6twx" Dec 06 15:47:13 crc kubenswrapper[4848]: I1206 15:47:13.992680 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.007169 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p6tct"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.014396 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmc2w\" (UniqueName: 
\"kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w\") pod \"barbican-db-create-6wzhg\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.023279 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089063 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089123 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089185 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9fl\" (UniqueName: \"kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089238 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") 
" pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089267 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089300 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdvdn\" (UniqueName: \"kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.089320 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkmm\" (UniqueName: \"kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.090722 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.113602 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vj76x"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.114642 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.115854 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdvdn\" (UniqueName: \"kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn\") pod \"cinder-8861-account-create-update-9w8cq\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.124057 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.134859 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vj76x"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.142453 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190382 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190787 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9fl\" (UniqueName: 
\"kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190910 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190957 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5c7\" (UniqueName: \"kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.190975 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtkmm\" (UniqueName: \"kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.195743 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.198275 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.198751 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.209047 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkmm\" (UniqueName: \"kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm\") pod \"keystone-db-sync-p6tct\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.213279 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9fl\" (UniqueName: \"kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl\") pod \"barbican-3c32-account-create-update-tkb9x\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.253074 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3f66-account-create-update-rnd8v"] Dec 06 15:47:14 crc 
kubenswrapper[4848]: I1206 15:47:14.254211 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.256818 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.262409 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3f66-account-create-update-rnd8v"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.262896 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf-config-jtnw9" event={"ID":"5314397f-9320-40b0-b18d-e9697cd268c8","Type":"ContainerStarted","Data":"d51bf060384adb2b2a7a5f940a6a8aa6f2f3d4cfe023bc723e810f2b1aedf521"} Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.269171 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.291593 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"5a4f2fb2d6bf3bef706243d0a25b0ae474bc1342ca0a3aadafe8b70d014f1d50"} Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.291788 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"6440c03f106b00c20a1fcd98ea54b263010b148cc9ba7f28eae546f74bfc3456"} Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.294222 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5c7\" (UniqueName: \"kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " 
pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.294288 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.295013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.317011 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5c7\" (UniqueName: \"kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7\") pod \"neutron-db-create-vj76x\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.333951 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-g6wzf-config-jtnw9" podStartSLOduration=4.333930147 podStartE2EDuration="4.333930147s" podCreationTimestamp="2025-12-06 15:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:14.310201424 +0000 UTC m=+1101.608212337" watchObservedRunningTime="2025-12-06 15:47:14.333930147 +0000 UTC m=+1101.631941060" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.340879 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.395813 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xgs\" (UniqueName: \"kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.396160 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.439395 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.510477 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xgs\" (UniqueName: \"kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.510574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.511563 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.533915 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-89hsl"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.537256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xgs\" (UniqueName: \"kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs\") pod \"neutron-3f66-account-create-update-rnd8v\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.581045 4848 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.777668 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6wzhg"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.920878 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c32-account-create-update-tkb9x"] Dec 06 15:47:14 crc kubenswrapper[4848]: W1206 15:47:14.934865 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1ab4bb7_ef7d_44ab_a5b4_5af92e5fdcbd.slice/crio-6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d WatchSource:0}: Error finding container 6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d: Status 404 returned error can't find the container with id 6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.941329 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8861-account-create-update-9w8cq"] Dec 06 15:47:14 crc kubenswrapper[4848]: I1206 15:47:14.951848 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p6tct"] Dec 06 15:47:14 crc kubenswrapper[4848]: W1206 15:47:14.955860 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58460aea_6f24_4346_b46d_7e0a7e0e0eca.slice/crio-0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8 WatchSource:0}: Error finding container 0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8: Status 404 returned error can't find the container with id 0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8 Dec 06 15:47:14 crc kubenswrapper[4848]: W1206 15:47:14.959230 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a892933_78a1_49d3_abee_8249ee831464.slice/crio-0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87 WatchSource:0}: Error finding container 0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87: Status 404 returned error can't find the container with id 0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87 Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.128094 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vj76x"] Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.139373 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3f66-account-create-update-rnd8v"] Dec 06 15:47:15 crc kubenswrapper[4848]: W1206 15:47:15.141007 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f2ab4c_f787_4a9c_acce_12e31518edcc.slice/crio-abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032 WatchSource:0}: Error finding container abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032: Status 404 returned error can't find the container with id abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032 Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.304118 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8861-account-create-update-9w8cq" event={"ID":"58460aea-6f24-4346-b46d-7e0a7e0e0eca","Type":"ContainerStarted","Data":"0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.306076 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wzhg" event={"ID":"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d","Type":"ContainerStarted","Data":"814ea72d1a9a939c3264392c69c1b5ca05aba9266b8c74ff21b89c8791db9dfb"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.318322 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c32-account-create-update-tkb9x" event={"ID":"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd","Type":"ContainerStarted","Data":"6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.322486 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f66-account-create-update-rnd8v" event={"ID":"242c04cd-53ea-4612-bc7d-e7e7d05700ed","Type":"ContainerStarted","Data":"c2b2ff70921361c3734ca51f31f189a7c8185b9fa6dbb021b70035529f2e474b"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.324360 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p6tct" event={"ID":"7a892933-78a1-49d3-abee-8249ee831464","Type":"ContainerStarted","Data":"0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.326936 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-89hsl" event={"ID":"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8","Type":"ContainerStarted","Data":"f86fbf027e25ebf86e7a427147f76e41a5f77251d2a0c9958fabc86ed9203c59"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.328290 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vj76x" event={"ID":"19f2ab4c-f787-4a9c-acce-12e31518edcc","Type":"ContainerStarted","Data":"abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032"} Dec 06 15:47:15 crc kubenswrapper[4848]: I1206 15:47:15.753112 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-g6wzf" Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.338642 4848 generic.go:334] "Generic (PLEG): container finished" podID="a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" containerID="738119c3dc529bf6faa01ad310109bd04fa86661da868f2d4884b217ca9f9d88" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 
15:47:16.338735 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c32-account-create-update-tkb9x" event={"ID":"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd","Type":"ContainerDied","Data":"738119c3dc529bf6faa01ad310109bd04fa86661da868f2d4884b217ca9f9d88"} Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.340566 4848 generic.go:334] "Generic (PLEG): container finished" podID="242c04cd-53ea-4612-bc7d-e7e7d05700ed" containerID="54d0c6bb49657f48afeb845d96777d9f1dcf881521a8dca5bb5bb6ec0d6922e7" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.340612 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f66-account-create-update-rnd8v" event={"ID":"242c04cd-53ea-4612-bc7d-e7e7d05700ed","Type":"ContainerDied","Data":"54d0c6bb49657f48afeb845d96777d9f1dcf881521a8dca5bb5bb6ec0d6922e7"} Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.341899 4848 generic.go:334] "Generic (PLEG): container finished" podID="ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" containerID="b5dc10be32b6eac52d81b1eeb98008339d651ea544b0919f5080ebd870bd21ce" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.341949 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-89hsl" event={"ID":"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8","Type":"ContainerDied","Data":"b5dc10be32b6eac52d81b1eeb98008339d651ea544b0919f5080ebd870bd21ce"} Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.343250 4848 generic.go:334] "Generic (PLEG): container finished" podID="19f2ab4c-f787-4a9c-acce-12e31518edcc" containerID="3939b923cc20641397bfd5785f5c7a548310b3d92c805ac9d6c52af866e89ac2" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.343313 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vj76x" event={"ID":"19f2ab4c-f787-4a9c-acce-12e31518edcc","Type":"ContainerDied","Data":"3939b923cc20641397bfd5785f5c7a548310b3d92c805ac9d6c52af866e89ac2"} Dec 06 15:47:16 
crc kubenswrapper[4848]: I1206 15:47:16.344741 4848 generic.go:334] "Generic (PLEG): container finished" podID="58460aea-6f24-4346-b46d-7e0a7e0e0eca" containerID="62c4f4edbb0973c5f6aed65262e9556a3716e4ff42d3616c81a949895967746d" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.344808 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8861-account-create-update-9w8cq" event={"ID":"58460aea-6f24-4346-b46d-7e0a7e0e0eca","Type":"ContainerDied","Data":"62c4f4edbb0973c5f6aed65262e9556a3716e4ff42d3616c81a949895967746d"} Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.350058 4848 generic.go:334] "Generic (PLEG): container finished" podID="5314397f-9320-40b0-b18d-e9697cd268c8" containerID="d51bf060384adb2b2a7a5f940a6a8aa6f2f3d4cfe023bc723e810f2b1aedf521" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.350127 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf-config-jtnw9" event={"ID":"5314397f-9320-40b0-b18d-e9697cd268c8","Type":"ContainerDied","Data":"d51bf060384adb2b2a7a5f940a6a8aa6f2f3d4cfe023bc723e810f2b1aedf521"} Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.352559 4848 generic.go:334] "Generic (PLEG): container finished" podID="6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" containerID="b2153f3a84bf7bf3848877a9b692a4d828ae0f91c2bcc800ee8fcc133d118375" exitCode=0 Dec 06 15:47:16 crc kubenswrapper[4848]: I1206 15:47:16.352597 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wzhg" event={"ID":"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d","Type":"ContainerDied","Data":"b2153f3a84bf7bf3848877a9b692a4d828ae0f91c2bcc800ee8fcc133d118375"} Dec 06 15:47:17 crc kubenswrapper[4848]: I1206 15:47:17.150624 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:47:17 crc kubenswrapper[4848]: I1206 15:47:17.150677 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.419783 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f66-account-create-update-rnd8v" event={"ID":"242c04cd-53ea-4612-bc7d-e7e7d05700ed","Type":"ContainerDied","Data":"c2b2ff70921361c3734ca51f31f189a7c8185b9fa6dbb021b70035529f2e474b"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.420262 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b2ff70921361c3734ca51f31f189a7c8185b9fa6dbb021b70035529f2e474b" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.421305 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-89hsl" event={"ID":"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8","Type":"ContainerDied","Data":"f86fbf027e25ebf86e7a427147f76e41a5f77251d2a0c9958fabc86ed9203c59"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.421350 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86fbf027e25ebf86e7a427147f76e41a5f77251d2a0c9958fabc86ed9203c59" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.422789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vj76x" event={"ID":"19f2ab4c-f787-4a9c-acce-12e31518edcc","Type":"ContainerDied","Data":"abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.422813 4848 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="abec8de908d26277c72a5614bb29ff0c6e2abb412b1a6bc30106b1c815de4032" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.424103 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8861-account-create-update-9w8cq" event={"ID":"58460aea-6f24-4346-b46d-7e0a7e0e0eca","Type":"ContainerDied","Data":"0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.424147 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b5d9eba076ddfb0a8b1d5daa41f011df23f2091ed95a8e5a331df3ba893ead8" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.425996 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g6wzf-config-jtnw9" event={"ID":"5314397f-9320-40b0-b18d-e9697cd268c8","Type":"ContainerDied","Data":"f0d91ca37850697afc40ec8d3a6c06f0881bc4261990f92d9e123997e6bb6295"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.426022 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0d91ca37850697afc40ec8d3a6c06f0881bc4261990f92d9e123997e6bb6295" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.427163 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6wzhg" event={"ID":"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d","Type":"ContainerDied","Data":"814ea72d1a9a939c3264392c69c1b5ca05aba9266b8c74ff21b89c8791db9dfb"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.427184 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="814ea72d1a9a939c3264392c69c1b5ca05aba9266b8c74ff21b89c8791db9dfb" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.429095 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c32-account-create-update-tkb9x" 
event={"ID":"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd","Type":"ContainerDied","Data":"6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d"} Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.429111 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6baee8e249b43025a175a66255690f521d985da41e620ce1ad99ef73a1f4a11d" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.516276 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.522420 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.536846 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.542009 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.554413 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.561294 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.575364 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669403 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669460 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669500 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669534 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts\") pod \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669586 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pgzll\" (UniqueName: \"kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll\") pod \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669605 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts\") pod \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669641 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmc2w\" (UniqueName: \"kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w\") pod \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\" (UID: \"6eae0c0b-0b9e-4fc7-b9b6-21b14900096d\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669658 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xgs\" (UniqueName: \"kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs\") pod \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\" (UID: \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669679 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj5c7\" (UniqueName: \"kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7\") pod \"19f2ab4c-f787-4a9c-acce-12e31518edcc\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669714 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts\") pod \"242c04cd-53ea-4612-bc7d-e7e7d05700ed\" (UID: 
\"242c04cd-53ea-4612-bc7d-e7e7d05700ed\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669744 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts\") pod \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669767 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts\") pod \"19f2ab4c-f787-4a9c-acce-12e31518edcc\" (UID: \"19f2ab4c-f787-4a9c-acce-12e31518edcc\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669822 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdvdn\" (UniqueName: \"kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn\") pod \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\" (UID: \"58460aea-6f24-4346-b46d-7e0a7e0e0eca\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669858 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ggq9\" (UniqueName: \"kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669884 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts\") pod \"5314397f-9320-40b0-b18d-e9697cd268c8\" (UID: \"5314397f-9320-40b0-b18d-e9697cd268c8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669910 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rf9fl\" (UniqueName: \"kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl\") pod \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\" (UID: \"a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.669929 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts\") pod \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\" (UID: \"ce4a5fd6-456b-4eb4-973a-a9cf690e9be8\") " Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.670507 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.670509 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run" (OuterVolumeSpecName: "var-run") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.670571 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.671108 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.671155 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58460aea-6f24-4346-b46d-7e0a7e0e0eca" (UID: "58460aea-6f24-4346-b46d-7e0a7e0e0eca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.671414 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" (UID: "ce4a5fd6-456b-4eb4-973a-a9cf690e9be8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.671597 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" (UID: "6eae0c0b-0b9e-4fc7-b9b6-21b14900096d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.672448 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" (UID: "a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.672925 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "242c04cd-53ea-4612-bc7d-e7e7d05700ed" (UID: "242c04cd-53ea-4612-bc7d-e7e7d05700ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.673809 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts" (OuterVolumeSpecName: "scripts") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.675418 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19f2ab4c-f787-4a9c-acce-12e31518edcc" (UID: "19f2ab4c-f787-4a9c-acce-12e31518edcc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.678166 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs" (OuterVolumeSpecName: "kube-api-access-d2xgs") pod "242c04cd-53ea-4612-bc7d-e7e7d05700ed" (UID: "242c04cd-53ea-4612-bc7d-e7e7d05700ed"). InnerVolumeSpecName "kube-api-access-d2xgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.680774 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl" (OuterVolumeSpecName: "kube-api-access-rf9fl") pod "a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" (UID: "a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd"). InnerVolumeSpecName "kube-api-access-rf9fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.680933 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w" (OuterVolumeSpecName: "kube-api-access-gmc2w") pod "6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" (UID: "6eae0c0b-0b9e-4fc7-b9b6-21b14900096d"). InnerVolumeSpecName "kube-api-access-gmc2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.682648 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7" (OuterVolumeSpecName: "kube-api-access-bj5c7") pod "19f2ab4c-f787-4a9c-acce-12e31518edcc" (UID: "19f2ab4c-f787-4a9c-acce-12e31518edcc"). InnerVolumeSpecName "kube-api-access-bj5c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.682664 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll" (OuterVolumeSpecName: "kube-api-access-pgzll") pod "ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" (UID: "ce4a5fd6-456b-4eb4-973a-a9cf690e9be8"). InnerVolumeSpecName "kube-api-access-pgzll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.684422 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9" (OuterVolumeSpecName: "kube-api-access-5ggq9") pod "5314397f-9320-40b0-b18d-e9697cd268c8" (UID: "5314397f-9320-40b0-b18d-e9697cd268c8"). InnerVolumeSpecName "kube-api-access-5ggq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.695941 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn" (OuterVolumeSpecName: "kube-api-access-bdvdn") pod "58460aea-6f24-4346-b46d-7e0a7e0e0eca" (UID: "58460aea-6f24-4346-b46d-7e0a7e0e0eca"). InnerVolumeSpecName "kube-api-access-bdvdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.771957 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdvdn\" (UniqueName: \"kubernetes.io/projected/58460aea-6f24-4346-b46d-7e0a7e0e0eca-kube-api-access-bdvdn\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.771983 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ggq9\" (UniqueName: \"kubernetes.io/projected/5314397f-9320-40b0-b18d-e9697cd268c8-kube-api-access-5ggq9\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772274 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772295 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf9fl\" (UniqueName: \"kubernetes.io/projected/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-kube-api-access-rf9fl\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772308 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772318 4848 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5314397f-9320-40b0-b18d-e9697cd268c8-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772329 4848 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc 
kubenswrapper[4848]: I1206 15:47:23.772340 4848 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772349 4848 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5314397f-9320-40b0-b18d-e9697cd268c8-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772357 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772365 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgzll\" (UniqueName: \"kubernetes.io/projected/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8-kube-api-access-pgzll\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772373 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58460aea-6f24-4346-b46d-7e0a7e0e0eca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772381 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmc2w\" (UniqueName: \"kubernetes.io/projected/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d-kube-api-access-gmc2w\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772806 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xgs\" (UniqueName: \"kubernetes.io/projected/242c04cd-53ea-4612-bc7d-e7e7d05700ed-kube-api-access-d2xgs\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772823 4848 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-bj5c7\" (UniqueName: \"kubernetes.io/projected/19f2ab4c-f787-4a9c-acce-12e31518edcc-kube-api-access-bj5c7\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772831 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242c04cd-53ea-4612-bc7d-e7e7d05700ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772839 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:23 crc kubenswrapper[4848]: I1206 15:47:23.772861 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f2ab4c-f787-4a9c-acce-12e31518edcc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.457229 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6wzhg" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.457892 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g6wzf-config-jtnw9" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.458249 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"4151bcf17adf4977aa2ec2705f7f53e98def142777c98b1c18f01b2badc9415b"} Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.458310 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"03fb5e5fdde0be12ebf0e41487e8bfe3c2aa340af3e38336ecc2ff153aec5cc1"} Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.458320 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"ff0f72a3e1908e87b44a1892274f9d6de73a0b290e22b8de83f5d792c9b7d136"} Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.458406 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vj76x" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.461488 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8861-account-create-update-9w8cq" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.461548 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c32-account-create-update-tkb9x" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.461490 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f66-account-create-update-rnd8v" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.461659 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-89hsl" Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.702330 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-g6wzf-config-jtnw9"] Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.718504 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-g6wzf-config-jtnw9"] Dec 06 15:47:24 crc kubenswrapper[4848]: I1206 15:47:24.980030 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5314397f-9320-40b0-b18d-e9697cd268c8" path="/var/lib/kubelet/pods/5314397f-9320-40b0-b18d-e9697cd268c8/volumes" Dec 06 15:47:25 crc kubenswrapper[4848]: I1206 15:47:25.469048 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l4vmt" event={"ID":"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930","Type":"ContainerStarted","Data":"ce0e406f279064825e9529bec9f70f2aa4138cb90d6c059852d844bef90c3485"} Dec 06 15:47:25 crc kubenswrapper[4848]: I1206 15:47:25.477277 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"6b7425c3bb598dd4615e97e1a64e5222f9a755fe569838e3e7d60f2a2ec30ecb"} Dec 06 15:47:25 crc kubenswrapper[4848]: I1206 15:47:25.477323 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"2b45517a7d6bcfb6f380a20555695e2d02b305f9de387c9c6ae3298d91db9dbd"} Dec 06 15:47:25 crc kubenswrapper[4848]: I1206 15:47:25.477359 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"569332b0639d2604d690187e72efd8736600e06c1635f121cc70d1ff75af7e5e"} Dec 06 15:47:25 crc kubenswrapper[4848]: I1206 15:47:25.484874 4848 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-db-sync-l4vmt" podStartSLOduration=3.105628641 podStartE2EDuration="19.484858353s" podCreationTimestamp="2025-12-06 15:47:06 +0000 UTC" firstStartedPulling="2025-12-06 15:47:07.543219076 +0000 UTC m=+1094.841229989" lastFinishedPulling="2025-12-06 15:47:23.922448788 +0000 UTC m=+1111.220459701" observedRunningTime="2025-12-06 15:47:25.483840936 +0000 UTC m=+1112.781851849" watchObservedRunningTime="2025-12-06 15:47:25.484858353 +0000 UTC m=+1112.782869266" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.569617 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14e2fe95-2aba-441c-85e1-ebd9bc0ba12f","Type":"ContainerStarted","Data":"73b59b18ffaaf8ececba846ac3468173f9818d4c0ae9cfd5c0e9ce654b8cf438"} Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.572319 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p6tct" event={"ID":"7a892933-78a1-49d3-abee-8249ee831464","Type":"ContainerStarted","Data":"c6081ff52d7a92534d5cd0e5ad138d067d5291ef1ffce09cd5888834425209f1"} Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.608004 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=28.852880971 podStartE2EDuration="46.607983775s" podCreationTimestamp="2025-12-06 15:46:48 +0000 UTC" firstStartedPulling="2025-12-06 15:47:06.150351484 +0000 UTC m=+1093.448362397" lastFinishedPulling="2025-12-06 15:47:23.905454288 +0000 UTC m=+1111.203465201" observedRunningTime="2025-12-06 15:47:34.603815162 +0000 UTC m=+1121.901826075" watchObservedRunningTime="2025-12-06 15:47:34.607983775 +0000 UTC m=+1121.905994688" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.628581 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p6tct" podStartSLOduration=2.604092994 podStartE2EDuration="21.628562252s" podCreationTimestamp="2025-12-06 
15:47:13 +0000 UTC" firstStartedPulling="2025-12-06 15:47:14.968095831 +0000 UTC m=+1102.266106744" lastFinishedPulling="2025-12-06 15:47:33.992565059 +0000 UTC m=+1121.290576002" observedRunningTime="2025-12-06 15:47:34.624844432 +0000 UTC m=+1121.922855375" watchObservedRunningTime="2025-12-06 15:47:34.628562252 +0000 UTC m=+1121.926573165" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874256 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874627 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874644 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874657 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5314397f-9320-40b0-b18d-e9697cd268c8" containerName="ovn-config" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874663 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5314397f-9320-40b0-b18d-e9697cd268c8" containerName="ovn-config" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874678 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58460aea-6f24-4346-b46d-7e0a7e0e0eca" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874686 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="58460aea-6f24-4346-b46d-7e0a7e0e0eca" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874716 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 
15:47:34.874722 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874731 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874737 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874747 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f2ab4c-f787-4a9c-acce-12e31518edcc" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874753 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f2ab4c-f787-4a9c-acce-12e31518edcc" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: E1206 15:47:34.874765 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c04cd-53ea-4612-bc7d-e7e7d05700ed" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874771 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="242c04cd-53ea-4612-bc7d-e7e7d05700ed" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874922 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5314397f-9320-40b0-b18d-e9697cd268c8" containerName="ovn-config" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874940 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874954 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" 
containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874964 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="242c04cd-53ea-4612-bc7d-e7e7d05700ed" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874978 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="58460aea-6f24-4346-b46d-7e0a7e0e0eca" containerName="mariadb-account-create-update" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.874993 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f2ab4c-f787-4a9c-acce-12e31518edcc" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.875001 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" containerName="mariadb-database-create" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.875877 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.879171 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.892980 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965351 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965401 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965427 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h22r\" (UniqueName: \"kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965447 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:34 crc kubenswrapper[4848]: I1206 15:47:34.965540 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067202 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067256 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h22r\" (UniqueName: \"kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067308 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067365 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.067406 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.069321 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.069954 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: 
I1206 15:47:35.070768 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.071533 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.073302 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.093072 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h22r\" (UniqueName: \"kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r\") pod \"dnsmasq-dns-6d5b6d6b67-x7nkg\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.193528 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:35 crc kubenswrapper[4848]: I1206 15:47:35.633937 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:36 crc kubenswrapper[4848]: I1206 15:47:36.594532 4848 generic.go:334] "Generic (PLEG): container finished" podID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerID="c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db" exitCode=0 Dec 06 15:47:36 crc kubenswrapper[4848]: I1206 15:47:36.594819 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" event={"ID":"dcd7c3ec-b278-41c6-a601-86db39696c8f","Type":"ContainerDied","Data":"c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db"} Dec 06 15:47:36 crc kubenswrapper[4848]: I1206 15:47:36.595079 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" event={"ID":"dcd7c3ec-b278-41c6-a601-86db39696c8f","Type":"ContainerStarted","Data":"1c13a4f26544d3e24134f8790fe1e1caf9e140a86abd70dd1be892168b0eaf54"} Dec 06 15:47:37 crc kubenswrapper[4848]: I1206 15:47:37.606002 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" event={"ID":"dcd7c3ec-b278-41c6-a601-86db39696c8f","Type":"ContainerStarted","Data":"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a"} Dec 06 15:47:37 crc kubenswrapper[4848]: I1206 15:47:37.607652 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:37 crc kubenswrapper[4848]: I1206 15:47:37.630436 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" podStartSLOduration=3.630421901 podStartE2EDuration="3.630421901s" podCreationTimestamp="2025-12-06 15:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:37.628756216 +0000 UTC m=+1124.926767129" watchObservedRunningTime="2025-12-06 15:47:37.630421901 +0000 UTC m=+1124.928432814" Dec 06 15:47:41 crc kubenswrapper[4848]: I1206 15:47:41.655401 4848 generic.go:334] "Generic (PLEG): container finished" podID="7a892933-78a1-49d3-abee-8249ee831464" containerID="c6081ff52d7a92534d5cd0e5ad138d067d5291ef1ffce09cd5888834425209f1" exitCode=0 Dec 06 15:47:41 crc kubenswrapper[4848]: I1206 15:47:41.655509 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p6tct" event={"ID":"7a892933-78a1-49d3-abee-8249ee831464","Type":"ContainerDied","Data":"c6081ff52d7a92534d5cd0e5ad138d067d5291ef1ffce09cd5888834425209f1"} Dec 06 15:47:42 crc kubenswrapper[4848]: I1206 15:47:42.964984 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.092510 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkmm\" (UniqueName: \"kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm\") pod \"7a892933-78a1-49d3-abee-8249ee831464\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.092851 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle\") pod \"7a892933-78a1-49d3-abee-8249ee831464\" (UID: \"7a892933-78a1-49d3-abee-8249ee831464\") " Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.092988 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data\") pod \"7a892933-78a1-49d3-abee-8249ee831464\" (UID: 
\"7a892933-78a1-49d3-abee-8249ee831464\") " Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.098379 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm" (OuterVolumeSpecName: "kube-api-access-xtkmm") pod "7a892933-78a1-49d3-abee-8249ee831464" (UID: "7a892933-78a1-49d3-abee-8249ee831464"). InnerVolumeSpecName "kube-api-access-xtkmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.118947 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a892933-78a1-49d3-abee-8249ee831464" (UID: "7a892933-78a1-49d3-abee-8249ee831464"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.149832 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data" (OuterVolumeSpecName: "config-data") pod "7a892933-78a1-49d3-abee-8249ee831464" (UID: "7a892933-78a1-49d3-abee-8249ee831464"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.195642 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.195718 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a892933-78a1-49d3-abee-8249ee831464-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.195733 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkmm\" (UniqueName: \"kubernetes.io/projected/7a892933-78a1-49d3-abee-8249ee831464-kube-api-access-xtkmm\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.672755 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p6tct" event={"ID":"7a892933-78a1-49d3-abee-8249ee831464","Type":"ContainerDied","Data":"0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87"} Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.673065 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0542c163a329e2312b14975538bfd8f9238593cc3342f90af0f26d12ce0d6f87" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.672822 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p6tct" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.921862 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.922128 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="dnsmasq-dns" containerID="cri-o://08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a" gracePeriod=10 Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.924189 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.955984 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:43 crc kubenswrapper[4848]: E1206 15:47:43.956425 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a892933-78a1-49d3-abee-8249ee831464" containerName="keystone-db-sync" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.956446 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a892933-78a1-49d3-abee-8249ee831464" containerName="keystone-db-sync" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.956687 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a892933-78a1-49d3-abee-8249ee831464" containerName="keystone-db-sync" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.957822 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.990369 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-v5rxm"] Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.991429 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.994305 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.994621 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s6twx" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.995070 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.996975 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.997199 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 15:47:43 crc kubenswrapper[4848]: I1206 15:47:43.997445 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.009348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.009648 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.009825 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.010012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.010139 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdklj\" (UniqueName: \"kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.010234 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.055984 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v5rxm"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113217 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " 
pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113301 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113336 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113359 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113396 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9lq\" (UniqueName: \"kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113426 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " 
pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113446 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113475 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113519 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113561 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113615 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdklj\" (UniqueName: \"kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " 
pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.113642 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.114988 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.115672 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.116329 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.116972 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.119380 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.155595 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdklj\" (UniqueName: \"kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj\") pod \"dnsmasq-dns-6f8c45789f-82tzj\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.158666 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-9w7v4"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.159810 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.179100 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s6zql"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.180335 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.188299 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.188542 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-b5tjg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.188681 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.191904 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9w7v4"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.205725 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s6zql"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.215930 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.215969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.215992 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26q7h\" (UniqueName: \"kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " 
pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216028 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216078 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kncck\" (UniqueName: \"kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216097 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216122 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 
15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216295 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9lq\" (UniqueName: \"kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216320 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216346 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216374 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216400 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.216447 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.251772 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.254478 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.256742 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.261658 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.261892 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.262243 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.268257 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle\") pod \"keystone-bootstrap-v5rxm\" (UID: 
\"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.269845 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.269857 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.280338 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9lq\" (UniqueName: \"kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq\") pod \"keystone-bootstrap-v5rxm\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.297196 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.298728 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rfkvg"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.299667 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.303406 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.303751 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4486" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.303930 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.316857 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rfkvg"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.319834 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.319883 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.319916 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.319936 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.319986 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320037 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320055 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26q7h\" (UniqueName: \"kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320082 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320102 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320115 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320142 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kncck\" (UniqueName: \"kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320159 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc 
kubenswrapper[4848]: I1206 15:47:44.320202 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq9xt\" (UniqueName: \"kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.320844 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.324210 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.334188 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.342239 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.342656 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.345993 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.352560 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.354294 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.366580 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zqt4v"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.367635 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.369857 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.371210 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ffck" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.371277 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kncck\" (UniqueName: \"kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck\") pod \"cinder-db-sync-s6zql\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.397940 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26q7h\" (UniqueName: \"kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h\") pod \"ironic-db-create-9w7v4\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.399665 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-6625-account-create-update-7bnhs"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.400789 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.402834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.421891 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.421955 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbrb\" (UniqueName: \"kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.421980 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbd6\" (UniqueName: \"kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422025 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422210 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422238 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422266 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422317 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq9xt\" (UniqueName: \"kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422374 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.422463 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.423557 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.424824 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.428506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.439181 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.439900 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zqt4v"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.440390 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.446361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.460598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq9xt\" (UniqueName: \"kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt\") pod \"ceilometer-0\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") " pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.460657 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6625-account-create-update-7bnhs"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.510140 4848 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.510206 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.511519 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.522472 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.523735 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.529749 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.529816 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86464\" (UniqueName: \"kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.529860 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.529902 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbrb\" (UniqueName: \"kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.529945 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbd6\" (UniqueName: \"kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.530013 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.530062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.533617 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.570008 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.570098 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.570240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.585439 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.586365 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbd6\" (UniqueName: \"kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6\") pod \"barbican-db-sync-zqt4v\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") " pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.588361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vnbrb\" (UniqueName: \"kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb\") pod \"neutron-db-sync-rfkvg\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.611319 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-f882q"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.619505 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.642630 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.642833 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.642864 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrfj\" (UniqueName: \"kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.642947 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.642991 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.643086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.643126 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86464\" (UniqueName: \"kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.643189 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.643992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.644371 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.644928 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.645097 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xx9dj" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.647567 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f882q"] Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.658712 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s6zql" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.666091 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86464\" (UniqueName: \"kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464\") pod \"ironic-6625-account-create-update-7bnhs\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.674485 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.675182 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.717070 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zqt4v" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.722795 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.727617 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.729325 4848 generic.go:334] "Generic (PLEG): container finished" podID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerID="08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a" exitCode=0 Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.729360 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" event={"ID":"dcd7c3ec-b278-41c6-a601-86db39696c8f","Type":"ContainerDied","Data":"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a"} Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.729384 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" event={"ID":"dcd7c3ec-b278-41c6-a601-86db39696c8f","Type":"ContainerDied","Data":"1c13a4f26544d3e24134f8790fe1e1caf9e140a86abd70dd1be892168b0eaf54"} Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.729396 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-x7nkg" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.729399 4848 scope.go:117] "RemoveContainer" containerID="08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744312 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744358 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744386 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744422 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744474 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h22r\" (UniqueName: \"kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: 
\"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744550 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb\") pod \"dcd7c3ec-b278-41c6-a601-86db39696c8f\" (UID: \"dcd7c3ec-b278-41c6-a601-86db39696c8f\") " Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744766 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744791 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744820 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744840 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 
15:47:44.744881 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744942 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.744959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.745010 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.745030 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kdrfj\" (UniqueName: \"kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.745057 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8qrw\" (UniqueName: \"kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.746303 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.746868 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.747368 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.748337 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.749093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.755641 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r" (OuterVolumeSpecName: "kube-api-access-8h22r") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "kube-api-access-8h22r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.768608 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrfj\" (UniqueName: \"kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj\") pod \"dnsmasq-dns-fcfdd6f9f-hkpld\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.838171 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847010 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8qrw\" (UniqueName: \"kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847363 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847412 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847486 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847547 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.847604 
4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h22r\" (UniqueName: \"kubernetes.io/projected/dcd7c3ec-b278-41c6-a601-86db39696c8f-kube-api-access-8h22r\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.848009 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.852578 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.866095 4848 scope.go:117] "RemoveContainer" containerID="c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.872278 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.872366 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.874191 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.879918 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.879951 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8qrw\" (UniqueName: \"kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw\") pod \"placement-db-sync-f882q\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.884969 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.885580 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.913018 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config" (OuterVolumeSpecName: "config") pod "dcd7c3ec-b278-41c6-a601-86db39696c8f" (UID: "dcd7c3ec-b278-41c6-a601-86db39696c8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.948927 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.948962 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.948971 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.948980 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.948990 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcd7c3ec-b278-41c6-a601-86db39696c8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.951286 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v5rxm"] Dec 06 15:47:44 crc kubenswrapper[4848]: 
I1206 15:47:44.963679 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f882q" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.978367 4848 scope.go:117] "RemoveContainer" containerID="08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a" Dec 06 15:47:44 crc kubenswrapper[4848]: E1206 15:47:44.985438 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a\": container with ID starting with 08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a not found: ID does not exist" containerID="08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.985479 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a"} err="failed to get container status \"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a\": rpc error: code = NotFound desc = could not find container \"08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a\": container with ID starting with 08a86581fc8dd77c1ef384235787f7faa93f10da68f3414dcb0466ad5da7c77a not found: ID does not exist" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.985504 4848 scope.go:117] "RemoveContainer" containerID="c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db" Dec 06 15:47:44 crc kubenswrapper[4848]: E1206 15:47:44.988585 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db\": container with ID starting with c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db not found: ID does not exist" 
containerID="c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db" Dec 06 15:47:44 crc kubenswrapper[4848]: I1206 15:47:44.988634 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db"} err="failed to get container status \"c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db\": rpc error: code = NotFound desc = could not find container \"c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db\": container with ID starting with c1c1673c71ba1173a83674fcef908b88e679e316f5db39671f872a87fb5fc5db not found: ID does not exist" Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.106999 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.216259 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9w7v4"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.409208 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.425338 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-x7nkg"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.435233 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.449192 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s6zql"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.567659 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rfkvg"] Dec 06 15:47:45 crc kubenswrapper[4848]: W1206 15:47:45.595251 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4758d52_e17c_484e_96a3_4879daace03e.slice/crio-996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59 WatchSource:0}: Error finding container 996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59: Status 404 returned error can't find the container with id 996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59 Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.600805 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zqt4v"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.747628 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6625-account-create-update-7bnhs"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.754246 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.762058 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9w7v4" event={"ID":"14d771cf-7f6f-474d-bfed-6e48e6deca38","Type":"ContainerStarted","Data":"4db295b48c9b56a1a8fbbe8f6b11262de41080ea1b4b42e54fc1046dbb11da7d"} Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.763612 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" event={"ID":"53bcaa86-5455-4b12-b920-931d4aa00170","Type":"ContainerStarted","Data":"6a3fa6294f7f57294b1a6f043b4a9c7b647ae856bef7772958795aa7816e5280"} Dec 06 15:47:45 crc kubenswrapper[4848]: W1206 15:47:45.765088 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod581ce2b8_5ed1_410e_ab1d_f860e76b8546.slice/crio-e006201403a23cfb6ccfd2e9b6a555f9b59cab7da4669f80b229c2b3cbd63956 WatchSource:0}: Error finding container e006201403a23cfb6ccfd2e9b6a555f9b59cab7da4669f80b229c2b3cbd63956: Status 404 returned error can't find the 
container with id e006201403a23cfb6ccfd2e9b6a555f9b59cab7da4669f80b229c2b3cbd63956 Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.767003 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6zql" event={"ID":"bb8d9713-c9fb-42c1-8496-03e949d82d8e","Type":"ContainerStarted","Data":"710a275d993a5ddc2e08ba6406dc28dadeea12914faeb84d0e8748cf7d04e15e"} Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.770599 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerStarted","Data":"6f5502164c2e7bc140eaa1ca782f5af324b8e8a81bf68272b852cf737ac8f1d1"} Dec 06 15:47:45 crc kubenswrapper[4848]: W1206 15:47:45.772408 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29032f23_2ae4_4ec4_8a4c_8f647d120974.slice/crio-bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258 WatchSource:0}: Error finding container bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258: Status 404 returned error can't find the container with id bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258 Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.773561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v5rxm" event={"ID":"f1107f25-5755-4061-9f94-711281b2c74a","Type":"ContainerStarted","Data":"0df667375cb8fca8e19b21d5f9fcca0606484e379b8a172d2c3438f2a286a907"} Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.775012 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rfkvg" event={"ID":"dc0186b0-9bb6-401b-bec2-80ee1058b4e8","Type":"ContainerStarted","Data":"10d12c7c55595425ecac5750149eff17b92fc147fd678f5d6e4081a582ed07bd"} Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.778113 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-zqt4v" event={"ID":"b4758d52-e17c-484e-96a3-4879daace03e","Type":"ContainerStarted","Data":"996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59"} Dec 06 15:47:45 crc kubenswrapper[4848]: I1206 15:47:45.897200 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f882q"] Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.227220 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.791129 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f882q" event={"ID":"46997064-cc24-406e-8971-0cdbad196707","Type":"ContainerStarted","Data":"f27bf0f28bb1caf112739b385cb8c869fdda954d189f4627d8ea53adad1117c2"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.795136 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v5rxm" event={"ID":"f1107f25-5755-4061-9f94-711281b2c74a","Type":"ContainerStarted","Data":"39483f16e27cf26851cbe8c6f30260590a5b9f2670add6ad8922d4fd4ba4bd0f"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.798109 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rfkvg" event={"ID":"dc0186b0-9bb6-401b-bec2-80ee1058b4e8","Type":"ContainerStarted","Data":"1d28c15762b6288c63e38c8b80841f096bf6cb44b59512f1451441ff95905739"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.801421 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9w7v4" event={"ID":"14d771cf-7f6f-474d-bfed-6e48e6deca38","Type":"ContainerDied","Data":"ba6d1ef608a1297fe15ddb76c6dba0807ef21db42c5af25d2b5d13fc8d687077"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.801283 4848 generic.go:334] "Generic (PLEG): container finished" podID="14d771cf-7f6f-474d-bfed-6e48e6deca38" containerID="ba6d1ef608a1297fe15ddb76c6dba0807ef21db42c5af25d2b5d13fc8d687077" exitCode=0 Dec 06 
15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.810277 4848 generic.go:334] "Generic (PLEG): container finished" podID="29032f23-2ae4-4ec4-8a4c-8f647d120974" containerID="ef4659fc99ee6cac30b751a280ea6d3c224353e1179e2cbca40da0d0ef12143d" exitCode=0 Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.812252 4848 generic.go:334] "Generic (PLEG): container finished" podID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerID="01b9dc8d1f8eaf4f3a52e0505b230b5656877b3e5eb11f25409895692c31e645" exitCode=0 Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.810241 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6625-account-create-update-7bnhs" event={"ID":"29032f23-2ae4-4ec4-8a4c-8f647d120974","Type":"ContainerDied","Data":"ef4659fc99ee6cac30b751a280ea6d3c224353e1179e2cbca40da0d0ef12143d"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.815581 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6625-account-create-update-7bnhs" event={"ID":"29032f23-2ae4-4ec4-8a4c-8f647d120974","Type":"ContainerStarted","Data":"bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.816557 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" event={"ID":"581ce2b8-5ed1-410e-ab1d-f860e76b8546","Type":"ContainerDied","Data":"01b9dc8d1f8eaf4f3a52e0505b230b5656877b3e5eb11f25409895692c31e645"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.816585 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" event={"ID":"581ce2b8-5ed1-410e-ab1d-f860e76b8546","Type":"ContainerStarted","Data":"e006201403a23cfb6ccfd2e9b6a555f9b59cab7da4669f80b229c2b3cbd63956"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.820626 4848 generic.go:334] "Generic (PLEG): container finished" podID="e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" 
containerID="ce0e406f279064825e9529bec9f70f2aa4138cb90d6c059852d844bef90c3485" exitCode=0 Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.820753 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l4vmt" event={"ID":"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930","Type":"ContainerDied","Data":"ce0e406f279064825e9529bec9f70f2aa4138cb90d6c059852d844bef90c3485"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.821833 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-v5rxm" podStartSLOduration=3.8217926220000003 podStartE2EDuration="3.821792622s" podCreationTimestamp="2025-12-06 15:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:46.812252683 +0000 UTC m=+1134.110263596" watchObservedRunningTime="2025-12-06 15:47:46.821792622 +0000 UTC m=+1134.119803535" Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.822886 4848 generic.go:334] "Generic (PLEG): container finished" podID="53bcaa86-5455-4b12-b920-931d4aa00170" containerID="5a6bc7b4353e71dbb7b5a1223be2f2f4b63cc7af16bee94894b587cbdc075766" exitCode=0 Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.822933 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" event={"ID":"53bcaa86-5455-4b12-b920-931d4aa00170","Type":"ContainerDied","Data":"5a6bc7b4353e71dbb7b5a1223be2f2f4b63cc7af16bee94894b587cbdc075766"} Dec 06 15:47:46 crc kubenswrapper[4848]: I1206 15:47:46.911095 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rfkvg" podStartSLOduration=2.911071871 podStartE2EDuration="2.911071871s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:46.855139766 +0000 UTC 
m=+1134.153150679" watchObservedRunningTime="2025-12-06 15:47:46.911071871 +0000 UTC m=+1134.209082784" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.062715 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" path="/var/lib/kubelet/pods/dcd7c3ec-b278-41c6-a601-86db39696c8f/volumes" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.160644 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.160976 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.161024 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.161833 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.161891 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" 
containerName="machine-config-daemon" containerID="cri-o://7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa" gracePeriod=600 Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.333533 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493193 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0\") pod \"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493286 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config\") pod \"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493325 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb\") pod \"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493387 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc\") pod \"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493431 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb\") pod 
\"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.493456 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdklj\" (UniqueName: \"kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj\") pod \"53bcaa86-5455-4b12-b920-931d4aa00170\" (UID: \"53bcaa86-5455-4b12-b920-931d4aa00170\") " Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.509844 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj" (OuterVolumeSpecName: "kube-api-access-vdklj") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "kube-api-access-vdklj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.545814 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.547676 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config" (OuterVolumeSpecName: "config") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.550209 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.550986 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.562088 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53bcaa86-5455-4b12-b920-931d4aa00170" (UID: "53bcaa86-5455-4b12-b920-931d4aa00170"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595523 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595558 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595568 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595577 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595585 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53bcaa86-5455-4b12-b920-931d4aa00170-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.595594 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdklj\" (UniqueName: \"kubernetes.io/projected/53bcaa86-5455-4b12-b920-931d4aa00170-kube-api-access-vdklj\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.853258 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa" exitCode=0 Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.853327 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa"} Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.853354 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416"} Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.853371 4848 scope.go:117] "RemoveContainer" containerID="0c6dce4a805c82f5f7db3f50f5c57941411fd68b7c39c5fc92171551376370cc" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.860115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" event={"ID":"581ce2b8-5ed1-410e-ab1d-f860e76b8546","Type":"ContainerStarted","Data":"2ca8d770aee264d2325b5eea9c3cfd3d849ec3e0c88d92ba70ab1d307521eb1d"} Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.860251 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.865586 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" event={"ID":"53bcaa86-5455-4b12-b920-931d4aa00170","Type":"ContainerDied","Data":"6a3fa6294f7f57294b1a6f043b4a9c7b647ae856bef7772958795aa7816e5280"} Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.865649 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-82tzj" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.896170 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" podStartSLOduration=3.896155062 podStartE2EDuration="3.896155062s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:47:47.888897396 +0000 UTC m=+1135.186908309" watchObservedRunningTime="2025-12-06 15:47:47.896155062 +0000 UTC m=+1135.194165975" Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.938143 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:47 crc kubenswrapper[4848]: I1206 15:47:47.950359 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-82tzj"] Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.373126 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.494902 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.509483 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.516933 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26q7h\" (UniqueName: \"kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h\") pod \"14d771cf-7f6f-474d-bfed-6e48e6deca38\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.517160 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts\") pod \"14d771cf-7f6f-474d-bfed-6e48e6deca38\" (UID: \"14d771cf-7f6f-474d-bfed-6e48e6deca38\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.518210 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14d771cf-7f6f-474d-bfed-6e48e6deca38" (UID: "14d771cf-7f6f-474d-bfed-6e48e6deca38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.524336 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h" (OuterVolumeSpecName: "kube-api-access-26q7h") pod "14d771cf-7f6f-474d-bfed-6e48e6deca38" (UID: "14d771cf-7f6f-474d-bfed-6e48e6deca38"). InnerVolumeSpecName "kube-api-access-26q7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.539271 4848 scope.go:117] "RemoveContainer" containerID="5a6bc7b4353e71dbb7b5a1223be2f2f4b63cc7af16bee94894b587cbdc075766" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.618903 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86464\" (UniqueName: \"kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464\") pod \"29032f23-2ae4-4ec4-8a4c-8f647d120974\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.619026 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data\") pod \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.619074 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7nz9\" (UniqueName: \"kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9\") pod \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.619113 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts\") pod \"29032f23-2ae4-4ec4-8a4c-8f647d120974\" (UID: \"29032f23-2ae4-4ec4-8a4c-8f647d120974\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.619604 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data\") pod \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\" (UID: 
\"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.619649 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle\") pod \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\" (UID: \"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930\") " Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.620657 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14d771cf-7f6f-474d-bfed-6e48e6deca38-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.620711 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26q7h\" (UniqueName: \"kubernetes.io/projected/14d771cf-7f6f-474d-bfed-6e48e6deca38-kube-api-access-26q7h\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.620975 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29032f23-2ae4-4ec4-8a4c-8f647d120974" (UID: "29032f23-2ae4-4ec4-8a4c-8f647d120974"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.623859 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" (UID: "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.624655 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9" (OuterVolumeSpecName: "kube-api-access-k7nz9") pod "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" (UID: "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930"). InnerVolumeSpecName "kube-api-access-k7nz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.625446 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464" (OuterVolumeSpecName: "kube-api-access-86464") pod "29032f23-2ae4-4ec4-8a4c-8f647d120974" (UID: "29032f23-2ae4-4ec4-8a4c-8f647d120974"). InnerVolumeSpecName "kube-api-access-86464". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.646922 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" (UID: "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.673415 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data" (OuterVolumeSpecName: "config-data") pod "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" (UID: "e7b51e33-73e3-4dc5-83a7-fcbd0cc69930"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722359 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29032f23-2ae4-4ec4-8a4c-8f647d120974-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722404 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722415 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722425 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86464\" (UniqueName: \"kubernetes.io/projected/29032f23-2ae4-4ec4-8a4c-8f647d120974-kube-api-access-86464\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722436 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.722443 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7nz9\" (UniqueName: \"kubernetes.io/projected/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930-kube-api-access-k7nz9\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.876572 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l4vmt" event={"ID":"e7b51e33-73e3-4dc5-83a7-fcbd0cc69930","Type":"ContainerDied","Data":"3641f4e1d1c4641c84197390a7283864a3fb38df1c17091d0c7a2a3ae02f0501"} 
Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.876614 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3641f4e1d1c4641c84197390a7283864a3fb38df1c17091d0c7a2a3ae02f0501" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.876666 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l4vmt" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.885085 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9w7v4" event={"ID":"14d771cf-7f6f-474d-bfed-6e48e6deca38","Type":"ContainerDied","Data":"4db295b48c9b56a1a8fbbe8f6b11262de41080ea1b4b42e54fc1046dbb11da7d"} Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.885121 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db295b48c9b56a1a8fbbe8f6b11262de41080ea1b4b42e54fc1046dbb11da7d" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.885171 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9w7v4" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.902199 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6625-account-create-update-7bnhs" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.903220 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6625-account-create-update-7bnhs" event={"ID":"29032f23-2ae4-4ec4-8a4c-8f647d120974","Type":"ContainerDied","Data":"bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258"} Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.903264 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf2f599832362b37e6c6aff40143ab4995e0be7723a0b3d6e9c3c10a3b6a1258" Dec 06 15:47:48 crc kubenswrapper[4848]: I1206 15:47:48.977907 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bcaa86-5455-4b12-b920-931d4aa00170" path="/var/lib/kubelet/pods/53bcaa86-5455-4b12-b920-931d4aa00170/volumes" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.296463 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.347232 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"] Dec 06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.347845 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="init" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.347938 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="init" Dec 06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.348014 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29032f23-2ae4-4ec4-8a4c-8f647d120974" containerName="mariadb-account-create-update" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348088 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="29032f23-2ae4-4ec4-8a4c-8f647d120974" containerName="mariadb-account-create-update" Dec 
06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.348149 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bcaa86-5455-4b12-b920-931d4aa00170" containerName="init" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348200 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bcaa86-5455-4b12-b920-931d4aa00170" containerName="init" Dec 06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.348257 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="dnsmasq-dns" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348324 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="dnsmasq-dns" Dec 06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.348419 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" containerName="glance-db-sync" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348506 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" containerName="glance-db-sync" Dec 06 15:47:49 crc kubenswrapper[4848]: E1206 15:47:49.348574 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d771cf-7f6f-474d-bfed-6e48e6deca38" containerName="mariadb-database-create" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348641 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d771cf-7f6f-474d-bfed-6e48e6deca38" containerName="mariadb-database-create" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.348922 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d771cf-7f6f-474d-bfed-6e48e6deca38" containerName="mariadb-database-create" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.349020 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" containerName="glance-db-sync" Dec 06 15:47:49 
crc kubenswrapper[4848]: I1206 15:47:49.349103 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd7c3ec-b278-41c6-a601-86db39696c8f" containerName="dnsmasq-dns" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.349186 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bcaa86-5455-4b12-b920-931d4aa00170" containerName="init" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.349286 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="29032f23-2ae4-4ec4-8a4c-8f647d120974" containerName="mariadb-account-create-update" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.350368 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.354794 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"] Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.553791 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.553899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.554035 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.554112 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.554143 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.554193 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4hqk\" (UniqueName: \"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658500 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4hqk\" (UniqueName: \"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658897 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.658992 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.660362 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.660372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.660402 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.660414 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.660471 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config\") pod \"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.678131 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4hqk\" (UniqueName: \"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk\") pod 
\"dnsmasq-dns-57c957c4ff-nk6xz\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.911328 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="dnsmasq-dns" containerID="cri-o://2ca8d770aee264d2325b5eea9c3cfd3d849ec3e0c88d92ba70ab1d307521eb1d" gracePeriod=10 Dec 06 15:47:49 crc kubenswrapper[4848]: I1206 15:47:49.975128 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.294710 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.298203 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.300037 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.300436 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-45bqd" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.300819 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.322766 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470301 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470653 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470690 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470838 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470875 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.470926 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6czlv\" (UniqueName: \"kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.505357 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.507313 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.512553 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.525503 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572399 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572467 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: 
I1206 15:47:50.572495 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572578 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572619 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.572635 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6czlv\" (UniqueName: \"kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.573793 4848 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.575610 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.576508 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.583956 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.584856 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.592380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6czlv\" (UniqueName: 
\"kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.592819 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.611834 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.620053 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.673874 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.673929 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xl4\" (UniqueName: \"kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.673973 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.674005 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.674037 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.674095 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.674155 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.775373 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.775480 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776366 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " 
pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776626 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776647 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xl4\" (UniqueName: \"kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776680 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776721 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 
15:47:50.776744 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.776888 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.780832 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.781378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.781969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.792244 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t5xl4\" (UniqueName: \"kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.806204 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.827116 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.931891 4848 generic.go:334] "Generic (PLEG): container finished" podID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerID="2ca8d770aee264d2325b5eea9c3cfd3d849ec3e0c88d92ba70ab1d307521eb1d" exitCode=0 Dec 06 15:47:50 crc kubenswrapper[4848]: I1206 15:47:50.931937 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" event={"ID":"581ce2b8-5ed1-410e-ab1d-f860e76b8546","Type":"ContainerDied","Data":"2ca8d770aee264d2325b5eea9c3cfd3d849ec3e0c88d92ba70ab1d307521eb1d"} Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.803063 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944354 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944427 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944454 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdrfj\" (UniqueName: \"kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944485 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944538 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.944565 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0\") pod \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\" (UID: \"581ce2b8-5ed1-410e-ab1d-f860e76b8546\") " Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.949366 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj" (OuterVolumeSpecName: "kube-api-access-kdrfj") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "kube-api-access-kdrfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.973109 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" event={"ID":"581ce2b8-5ed1-410e-ab1d-f860e76b8546","Type":"ContainerDied","Data":"e006201403a23cfb6ccfd2e9b6a555f9b59cab7da4669f80b229c2b3cbd63956"} Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.973299 4848 scope.go:117] "RemoveContainer" containerID="2ca8d770aee264d2325b5eea9c3cfd3d849ec3e0c88d92ba70ab1d307521eb1d" Dec 06 15:47:53 crc kubenswrapper[4848]: I1206 15:47:53.973460 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-hkpld" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.006302 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.006308 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config" (OuterVolumeSpecName: "config") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.007266 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.009304 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.024111 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "581ce2b8-5ed1-410e-ab1d-f860e76b8546" (UID: "581ce2b8-5ed1-410e-ab1d-f860e76b8546"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.048850 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.049040 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.049053 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdrfj\" (UniqueName: \"kubernetes.io/projected/581ce2b8-5ed1-410e-ab1d-f860e76b8546-kube-api-access-kdrfj\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.049062 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.049072 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.049079 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/581ce2b8-5ed1-410e-ab1d-f860e76b8546-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.122819 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.179360 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 
15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.302195 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.308719 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-hkpld"] Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.704349 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-7xkpk"] Dec 06 15:47:54 crc kubenswrapper[4848]: E1206 15:47:54.704867 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="init" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.704888 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="init" Dec 06 15:47:54 crc kubenswrapper[4848]: E1206 15:47:54.704905 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="dnsmasq-dns" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.704913 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="dnsmasq-dns" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.705140 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" containerName="dnsmasq-dns" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.706276 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.708263 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-dockercfg-zq8gd" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.708394 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.708612 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.725343 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-7xkpk"] Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862263 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862330 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862385 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862411 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvjh\" (UniqueName: \"kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862484 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.862509 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964579 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964642 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964761 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964799 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964844 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.964870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvjh\" (UniqueName: \"kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.965589 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.970955 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo\") pod \"ironic-db-sync-7xkpk\" (UID: 
\"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.972744 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.974454 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.976957 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.981673 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581ce2b8-5ed1-410e-ab1d-f860e76b8546" path="/var/lib/kubelet/pods/581ce2b8-5ed1-410e-ab1d-f860e76b8546/volumes" Dec 06 15:47:54 crc kubenswrapper[4848]: I1206 15:47:54.984219 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvjh\" (UniqueName: \"kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh\") pod \"ironic-db-sync-7xkpk\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:47:55 crc kubenswrapper[4848]: I1206 15:47:55.034585 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:48:04 crc kubenswrapper[4848]: E1206 15:48:04.955865 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 06 15:48:04 crc kubenswrapper[4848]: E1206 15:48:04.956652 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8qrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessage
Path:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-f882q_openstack(46997064-cc24-406e-8971-0cdbad196707): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:48:04 crc kubenswrapper[4848]: E1206 15:48:04.958060 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-f882q" podUID="46997064-cc24-406e-8971-0cdbad196707" Dec 06 15:48:05 crc kubenswrapper[4848]: E1206 15:48:05.084762 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-f882q" podUID="46997064-cc24-406e-8971-0cdbad196707" Dec 06 15:48:06 crc kubenswrapper[4848]: I1206 15:48:06.092415 4848 generic.go:334] "Generic (PLEG): container finished" podID="f1107f25-5755-4061-9f94-711281b2c74a" containerID="39483f16e27cf26851cbe8c6f30260590a5b9f2670add6ad8922d4fd4ba4bd0f" exitCode=0 Dec 06 15:48:06 crc kubenswrapper[4848]: I1206 15:48:06.092465 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v5rxm" 
event={"ID":"f1107f25-5755-4061-9f94-711281b2c74a","Type":"ContainerDied","Data":"39483f16e27cf26851cbe8c6f30260590a5b9f2670add6ad8922d4fd4ba4bd0f"} Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.096950 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178591 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt9lq\" (UniqueName: \"kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq\") pod \"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178634 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle\") pod \"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178661 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts\") pod \"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178733 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data\") pod \"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178760 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys\") pod 
\"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.178788 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys\") pod \"f1107f25-5755-4061-9f94-711281b2c74a\" (UID: \"f1107f25-5755-4061-9f94-711281b2c74a\") " Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.184387 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts" (OuterVolumeSpecName: "scripts") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.185047 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.185526 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v5rxm" event={"ID":"f1107f25-5755-4061-9f94-711281b2c74a","Type":"ContainerDied","Data":"0df667375cb8fca8e19b21d5f9fcca0606484e379b8a172d2c3438f2a286a907"} Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.185592 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df667375cb8fca8e19b21d5f9fcca0606484e379b8a172d2c3438f2a286a907" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.185656 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v5rxm" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.185784 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq" (OuterVolumeSpecName: "kube-api-access-tt9lq") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "kube-api-access-tt9lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.189973 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.203986 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.213337 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data" (OuterVolumeSpecName: "config-data") pod "f1107f25-5755-4061-9f94-711281b2c74a" (UID: "f1107f25-5755-4061-9f94-711281b2c74a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281233 4848 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281265 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt9lq\" (UniqueName: \"kubernetes.io/projected/f1107f25-5755-4061-9f94-711281b2c74a-kube-api-access-tt9lq\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281276 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281286 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281294 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: I1206 15:48:17.281302 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f1107f25-5755-4061-9f94-711281b2c74a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:17 crc kubenswrapper[4848]: E1206 15:48:17.675114 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 06 15:48:17 crc kubenswrapper[4848]: E1206 15:48:17.675267 4848 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xbd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zqt4v_openstack(b4758d52-e17c-484e-96a3-4879daace03e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:48:17 crc kubenswrapper[4848]: E1206 15:48:17.676948 4848 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zqt4v" podUID="b4758d52-e17c-484e-96a3-4879daace03e" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.167425 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-v5rxm"] Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.174020 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-v5rxm"] Dec 06 15:48:18 crc kubenswrapper[4848]: E1206 15:48:18.195218 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-zqt4v" podUID="b4758d52-e17c-484e-96a3-4879daace03e" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.273929 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q57fg"] Dec 06 15:48:18 crc kubenswrapper[4848]: E1206 15:48:18.274404 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1107f25-5755-4061-9f94-711281b2c74a" containerName="keystone-bootstrap" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.274424 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1107f25-5755-4061-9f94-711281b2c74a" containerName="keystone-bootstrap" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.274654 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1107f25-5755-4061-9f94-711281b2c74a" containerName="keystone-bootstrap" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.275610 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.277756 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.277807 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.277995 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.278030 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s6twx" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.287767 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q57fg"] Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.298757 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.298971 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6qx\" (UniqueName: \"kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.299202 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle\") pod \"keystone-bootstrap-q57fg\" (UID: 
\"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.299267 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.299288 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.299328 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.405898 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.406089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " 
pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.406181 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.406230 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.406270 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.406456 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6qx\" (UniqueName: \"kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.411646 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.411852 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.412738 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.413394 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.416431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.425760 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6qx\" (UniqueName: \"kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx\") pod \"keystone-bootstrap-q57fg\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.603154 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.748411 4848 scope.go:117] "RemoveContainer" containerID="01b9dc8d1f8eaf4f3a52e0505b230b5656877b3e5eb11f25409895692c31e645" Dec 06 15:48:18 crc kubenswrapper[4848]: E1206 15:48:18.777356 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 06 15:48:18 crc kubenswrapper[4848]: E1206 15:48:18.777565 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kncck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s6zql_openstack(bb8d9713-c9fb-42c1-8496-03e949d82d8e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:48:18 crc kubenswrapper[4848]: E1206 15:48:18.778778 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s6zql" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" Dec 06 15:48:18 crc kubenswrapper[4848]: I1206 15:48:18.997479 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1107f25-5755-4061-9f94-711281b2c74a" path="/var/lib/kubelet/pods/f1107f25-5755-4061-9f94-711281b2c74a/volumes" Dec 06 15:48:19 crc kubenswrapper[4848]: I1206 15:48:19.217320 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerStarted","Data":"ce6f1a1589c157e1c88cafc71550e049b87f6286b07e122a2d2ad9037d6fc382"} Dec 06 15:48:19 crc kubenswrapper[4848]: E1206 15:48:19.229500 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-s6zql" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" Dec 06 15:48:19 crc kubenswrapper[4848]: I1206 15:48:19.262243 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"] Dec 06 15:48:19 crc kubenswrapper[4848]: W1206 15:48:19.286454 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a1000e7_e5bf_483a_aaa5_d79003725c6d.slice/crio-36b1eb77131f84373e8616e96bdaae1b248b57e3c95a397779a1904a784ae3bd WatchSource:0}: Error finding container 36b1eb77131f84373e8616e96bdaae1b248b57e3c95a397779a1904a784ae3bd: Status 404 returned error can't find the container with id 36b1eb77131f84373e8616e96bdaae1b248b57e3c95a397779a1904a784ae3bd Dec 06 15:48:19 crc kubenswrapper[4848]: I1206 15:48:19.342396 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-7xkpk"] Dec 06 15:48:19 crc kubenswrapper[4848]: W1206 15:48:19.348828 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75f41ed_628b_4e88_8d67_ada299f1c7a9.slice/crio-3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea WatchSource:0}: Error finding container 3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea: Status 404 returned error can't find the container with id 3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea Dec 06 15:48:19 crc kubenswrapper[4848]: I1206 
15:48:19.411902 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:19 crc kubenswrapper[4848]: I1206 15:48:19.464652 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q57fg"] Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.231967 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q57fg" event={"ID":"c533e36c-3e3e-4df0-85d8-81c87c5c8087","Type":"ContainerStarted","Data":"c59f025526913bd153a362ea60cd92bc58106084b528e88b90f3e57cee1f71af"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.232399 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q57fg" event={"ID":"c533e36c-3e3e-4df0-85d8-81c87c5c8087","Type":"ContainerStarted","Data":"f32cfbe8f0db285682595444d241cbf89e5fd46b0dc9fc68c661d635ad8af40a"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.234364 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerStarted","Data":"3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.235543 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerID="f72e273d36a4741dd55f1287edfac8615bb0312d13e8d23e0e4bb7a98da69b85" exitCode=0 Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.235585 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" event={"ID":"0a1000e7-e5bf-483a-aaa5-d79003725c6d","Type":"ContainerDied","Data":"f72e273d36a4741dd55f1287edfac8615bb0312d13e8d23e0e4bb7a98da69b85"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.235615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" 
event={"ID":"0a1000e7-e5bf-483a-aaa5-d79003725c6d","Type":"ContainerStarted","Data":"36b1eb77131f84373e8616e96bdaae1b248b57e3c95a397779a1904a784ae3bd"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.236576 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerStarted","Data":"2d635a0d3278d2db84fbfec17c292db8743950c32d76bdf4efd9998a4d1922db"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.236597 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerStarted","Data":"ef78ce99022f4d6a87bc64c923ea2d9b82d32778df9ee9d25f95e557f05430bc"} Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.249943 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q57fg" podStartSLOduration=2.249922625 podStartE2EDuration="2.249922625s" podCreationTimestamp="2025-12-06 15:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:20.246899313 +0000 UTC m=+1167.544910226" watchObservedRunningTime="2025-12-06 15:48:20.249922625 +0000 UTC m=+1167.547933538" Dec 06 15:48:20 crc kubenswrapper[4848]: I1206 15:48:20.513741 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.250970 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" event={"ID":"0a1000e7-e5bf-483a-aaa5-d79003725c6d","Type":"ContainerStarted","Data":"01b266830cc511b134b207bbce4248b0ae82af6c1341804ffb120ca6b1fed4a2"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.251643 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.303669 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f882q" event={"ID":"46997064-cc24-406e-8971-0cdbad196707","Type":"ContainerStarted","Data":"05affa8c282e92cb4ca335526850d4aa59af7970528d18157444e3ef1ad7a5d2"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.318574 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" podStartSLOduration=32.31855208 podStartE2EDuration="32.31855208s" podCreationTimestamp="2025-12-06 15:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:21.295097744 +0000 UTC m=+1168.593108667" watchObservedRunningTime="2025-12-06 15:48:21.31855208 +0000 UTC m=+1168.616562993" Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.331191 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerStarted","Data":"ba9b588a7a3cff1cdaf5f403c0aefade9722a7d1a2e2a54340c98b40a821a82f"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.331260 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-log" containerID="cri-o://2d635a0d3278d2db84fbfec17c292db8743950c32d76bdf4efd9998a4d1922db" gracePeriod=30 Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.331407 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-httpd" containerID="cri-o://ba9b588a7a3cff1cdaf5f403c0aefade9722a7d1a2e2a54340c98b40a821a82f" gracePeriod=30 Dec 06 15:48:21 crc 
kubenswrapper[4848]: I1206 15:48:21.332738 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-f882q" podStartSLOduration=2.501353288 podStartE2EDuration="37.332728114s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="2025-12-06 15:47:45.920628153 +0000 UTC m=+1133.218639056" lastFinishedPulling="2025-12-06 15:48:20.752002969 +0000 UTC m=+1168.050013882" observedRunningTime="2025-12-06 15:48:21.329827796 +0000 UTC m=+1168.627838719" watchObservedRunningTime="2025-12-06 15:48:21.332728114 +0000 UTC m=+1168.630739027" Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.337512 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerStarted","Data":"b6c00fb3993383a7f44505922d76417b105578e1867407554d7f2d89e992e320"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.351184 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerStarted","Data":"1c878e5c61b6d4d26b1402cf63219321022c05053e761ed832ca83404b5af29e"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.351228 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerStarted","Data":"debcc98e69c340f1ebad1e719f179d552ed293796acd6ea22b7d53878013a1ad"} Dec 06 15:48:21 crc kubenswrapper[4848]: I1206 15:48:21.365633 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=32.365613536 podStartE2EDuration="32.365613536s" podCreationTimestamp="2025-12-06 15:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:21.351164774 +0000 
UTC m=+1168.649175687" watchObservedRunningTime="2025-12-06 15:48:21.365613536 +0000 UTC m=+1168.663624449" Dec 06 15:48:22 crc kubenswrapper[4848]: I1206 15:48:22.362401 4848 generic.go:334] "Generic (PLEG): container finished" podID="9ee8617e-28af-4831-b486-b3d783eda739" containerID="2d635a0d3278d2db84fbfec17c292db8743950c32d76bdf4efd9998a4d1922db" exitCode=143 Dec 06 15:48:22 crc kubenswrapper[4848]: I1206 15:48:22.362490 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerDied","Data":"2d635a0d3278d2db84fbfec17c292db8743950c32d76bdf4efd9998a4d1922db"} Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.377495 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerStarted","Data":"9ef2595fe038a309746d865a5f3eabd60934da52fdef527000e3e77079a1bf14"} Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.377606 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-log" containerID="cri-o://1c878e5c61b6d4d26b1402cf63219321022c05053e761ed832ca83404b5af29e" gracePeriod=30 Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.377643 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-httpd" containerID="cri-o://9ef2595fe038a309746d865a5f3eabd60934da52fdef527000e3e77079a1bf14" gracePeriod=30 Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.385348 4848 generic.go:334] "Generic (PLEG): container finished" podID="9ee8617e-28af-4831-b486-b3d783eda739" containerID="ba9b588a7a3cff1cdaf5f403c0aefade9722a7d1a2e2a54340c98b40a821a82f" exitCode=0 Dec 06 15:48:23 crc 
kubenswrapper[4848]: I1206 15:48:23.385443 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerDied","Data":"ba9b588a7a3cff1cdaf5f403c0aefade9722a7d1a2e2a54340c98b40a821a82f"} Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.407449 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=34.407430487 podStartE2EDuration="34.407430487s" podCreationTimestamp="2025-12-06 15:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:23.401116296 +0000 UTC m=+1170.699127209" watchObservedRunningTime="2025-12-06 15:48:23.407430487 +0000 UTC m=+1170.705441400" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.564617 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724386 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xl4\" (UniqueName: \"kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724451 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724526 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724604 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724629 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.724737 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle\") pod \"9ee8617e-28af-4831-b486-b3d783eda739\" (UID: \"9ee8617e-28af-4831-b486-b3d783eda739\") " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.725407 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.726020 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs" (OuterVolumeSpecName: "logs") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.730439 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.730929 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts" (OuterVolumeSpecName: "scripts") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.743014 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4" (OuterVolumeSpecName: "kube-api-access-t5xl4") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "kube-api-access-t5xl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.756962 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.778424 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data" (OuterVolumeSpecName: "config-data") pod "9ee8617e-28af-4831-b486-b3d783eda739" (UID: "9ee8617e-28af-4831-b486-b3d783eda739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826313 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826353 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5xl4\" (UniqueName: \"kubernetes.io/projected/9ee8617e-28af-4831-b486-b3d783eda739-kube-api-access-t5xl4\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826366 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826375 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-logs\") on node \"crc\" DevicePath \"\"" Dec 06 
15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826417 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826428 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ee8617e-28af-4831-b486-b3d783eda739-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.826436 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8617e-28af-4831-b486-b3d783eda739-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.854193 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 06 15:48:23 crc kubenswrapper[4848]: I1206 15:48:23.928220 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.412294 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ee8617e-28af-4831-b486-b3d783eda739","Type":"ContainerDied","Data":"ef78ce99022f4d6a87bc64c923ea2d9b82d32778df9ee9d25f95e557f05430bc"} Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.412609 4848 scope.go:117] "RemoveContainer" containerID="ba9b588a7a3cff1cdaf5f403c0aefade9722a7d1a2e2a54340c98b40a821a82f" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.412767 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.419797 4848 generic.go:334] "Generic (PLEG): container finished" podID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerID="9ef2595fe038a309746d865a5f3eabd60934da52fdef527000e3e77079a1bf14" exitCode=0 Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.419832 4848 generic.go:334] "Generic (PLEG): container finished" podID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerID="1c878e5c61b6d4d26b1402cf63219321022c05053e761ed832ca83404b5af29e" exitCode=143 Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.419855 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerDied","Data":"9ef2595fe038a309746d865a5f3eabd60934da52fdef527000e3e77079a1bf14"} Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.419883 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerDied","Data":"1c878e5c61b6d4d26b1402cf63219321022c05053e761ed832ca83404b5af29e"} Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.479290 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.507796 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.517630 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:24 crc kubenswrapper[4848]: E1206 15:48:24.519871 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-log" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.519892 4848 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-log" Dec 06 15:48:24 crc kubenswrapper[4848]: E1206 15:48:24.519927 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-httpd" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.519934 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-httpd" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.520830 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-log" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.520883 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee8617e-28af-4831-b486-b3d783eda739" containerName="glance-httpd" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.521918 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.525531 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.527429 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.536283 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639391 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc 
kubenswrapper[4848]: I1206 15:48:24.639488 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639519 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639578 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639617 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639642 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 
06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639798 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2ps\" (UniqueName: \"kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.639917 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742097 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742171 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742199 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 
06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742282 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2ps\" (UniqueName: \"kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742312 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742361 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742402 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.742425 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.743753 4848 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.743779 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.744294 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.748666 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.749434 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.750172 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.787042 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.792643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.794877 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2ps\" (UniqueName: \"kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps\") pod \"glance-default-internal-api-0\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.861142 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:24 crc kubenswrapper[4848]: I1206 15:48:24.978183 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee8617e-28af-4831-b486-b3d783eda739" path="/var/lib/kubelet/pods/9ee8617e-28af-4831-b486-b3d783eda739/volumes" Dec 06 15:48:25 crc kubenswrapper[4848]: I1206 15:48:25.430294 4848 generic.go:334] "Generic (PLEG): container finished" podID="c533e36c-3e3e-4df0-85d8-81c87c5c8087" containerID="c59f025526913bd153a362ea60cd92bc58106084b528e88b90f3e57cee1f71af" exitCode=0 Dec 06 15:48:25 crc kubenswrapper[4848]: I1206 15:48:25.430349 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q57fg" event={"ID":"c533e36c-3e3e-4df0-85d8-81c87c5c8087","Type":"ContainerDied","Data":"c59f025526913bd153a362ea60cd92bc58106084b528e88b90f3e57cee1f71af"} Dec 06 15:48:26 crc kubenswrapper[4848]: I1206 15:48:26.443305 4848 generic.go:334] "Generic (PLEG): container finished" podID="46997064-cc24-406e-8971-0cdbad196707" containerID="05affa8c282e92cb4ca335526850d4aa59af7970528d18157444e3ef1ad7a5d2" exitCode=0 Dec 06 15:48:26 crc kubenswrapper[4848]: I1206 15:48:26.443421 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f882q" event={"ID":"46997064-cc24-406e-8971-0cdbad196707","Type":"ContainerDied","Data":"05affa8c282e92cb4ca335526850d4aa59af7970528d18157444e3ef1ad7a5d2"} Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.243121 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.249241 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.410784 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411231 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys\") pod \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411263 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd6qx\" (UniqueName: \"kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx\") pod \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411281 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411360 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411400 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data\") pod \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411470 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411508 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411551 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411583 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts\") pod 
\"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411605 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle\") pod \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411627 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6czlv\" (UniqueName: \"kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411673 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys\") pod \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\" (UID: \"c533e36c-3e3e-4df0-85d8-81c87c5c8087\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.411743 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle\") pod \"549a24f1-9102-428e-903b-4f34a1ebb55e\" (UID: \"549a24f1-9102-428e-903b-4f34a1ebb55e\") " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.412236 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.414087 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs" 
(OuterVolumeSpecName: "logs") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.417231 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.417792 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts" (OuterVolumeSpecName: "scripts") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.418242 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx" (OuterVolumeSpecName: "kube-api-access-xd6qx") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "kube-api-access-xd6qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.418318 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.418544 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.420017 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv" (OuterVolumeSpecName: "kube-api-access-6czlv") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "kube-api-access-6czlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.421075 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts" (OuterVolumeSpecName: "scripts") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.437607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data" (OuterVolumeSpecName: "config-data") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.444010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c533e36c-3e3e-4df0-85d8-81c87c5c8087" (UID: "c533e36c-3e3e-4df0-85d8-81c87c5c8087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.444105 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.452378 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q57fg" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.452560 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q57fg" event={"ID":"c533e36c-3e3e-4df0-85d8-81c87c5c8087","Type":"ContainerDied","Data":"f32cfbe8f0db285682595444d241cbf89e5fd46b0dc9fc68c661d635ad8af40a"} Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.452611 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f32cfbe8f0db285682595444d241cbf89e5fd46b0dc9fc68c661d635ad8af40a" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.454412 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.455073 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"549a24f1-9102-428e-903b-4f34a1ebb55e","Type":"ContainerDied","Data":"debcc98e69c340f1ebad1e719f179d552ed293796acd6ea22b7d53878013a1ad"} Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.462784 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data" (OuterVolumeSpecName: "config-data") pod "549a24f1-9102-428e-903b-4f34a1ebb55e" (UID: "549a24f1-9102-428e-903b-4f34a1ebb55e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513531 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513569 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513585 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549a24f1-9102-428e-903b-4f34a1ebb55e-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513623 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513635 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513646 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513658 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6czlv\" (UniqueName: \"kubernetes.io/projected/549a24f1-9102-428e-903b-4f34a1ebb55e-kube-api-access-6czlv\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513670 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513680 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513691 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549a24f1-9102-428e-903b-4f34a1ebb55e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513720 4848 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c533e36c-3e3e-4df0-85d8-81c87c5c8087-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.513732 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd6qx\" (UniqueName: \"kubernetes.io/projected/c533e36c-3e3e-4df0-85d8-81c87c5c8087-kube-api-access-xd6qx\") on node \"crc\" DevicePath \"\"" Dec 
06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.543011 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.616029 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.638839 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-84fbb4c9b8-bdccl"] Dec 06 15:48:27 crc kubenswrapper[4848]: E1206 15:48:27.646031 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-log" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646054 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-log" Dec 06 15:48:27 crc kubenswrapper[4848]: E1206 15:48:27.646076 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c533e36c-3e3e-4df0-85d8-81c87c5c8087" containerName="keystone-bootstrap" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646084 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c533e36c-3e3e-4df0-85d8-81c87c5c8087" containerName="keystone-bootstrap" Dec 06 15:48:27 crc kubenswrapper[4848]: E1206 15:48:27.646097 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-httpd" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646103 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-httpd" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646260 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" 
containerName="glance-httpd" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646276 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c533e36c-3e3e-4df0-85d8-81c87c5c8087" containerName="keystone-bootstrap" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.646292 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" containerName="glance-log" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.648151 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.651005 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.651242 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.651337 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.651440 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s6twx" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.651627 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.652338 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.682530 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84fbb4c9b8-bdccl"] Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.794341 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.799657 
4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822391 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-combined-ca-bundle\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-scripts\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822505 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-internal-tls-certs\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822549 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-fernet-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822579 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-public-tls-certs\") pod 
\"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822598 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-config-data\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822617 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-credential-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822625 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.822640 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k92rl\" (UniqueName: \"kubernetes.io/projected/63d987d2-da9e-4cfb-b409-2b5c66f307f8-kube-api-access-k92rl\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.824751 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.827263 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.827991 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.836721 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924096 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924153 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkzt\" (UniqueName: \"kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924179 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k92rl\" (UniqueName: \"kubernetes.io/projected/63d987d2-da9e-4cfb-b409-2b5c66f307f8-kube-api-access-k92rl\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924199 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924241 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-combined-ca-bundle\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924321 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-scripts\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924348 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-internal-tls-certs\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924387 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924415 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924439 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-fernet-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924458 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924482 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924506 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924527 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-public-tls-certs\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924546 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-config-data\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.924562 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-credential-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.929044 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-credential-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.929151 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-public-tls-certs\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.930577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-fernet-keys\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: 
\"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.932151 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-combined-ca-bundle\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.935605 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-config-data\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.936521 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-scripts\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.938500 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d987d2-da9e-4cfb-b409-2b5c66f307f8-internal-tls-certs\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: I1206 15:48:27.944783 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k92rl\" (UniqueName: \"kubernetes.io/projected/63d987d2-da9e-4cfb-b409-2b5c66f307f8-kube-api-access-k92rl\") pod \"keystone-84fbb4c9b8-bdccl\" (UID: \"63d987d2-da9e-4cfb-b409-2b5c66f307f8\") " pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:27 crc kubenswrapper[4848]: 
I1206 15:48:27.972046 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026570 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026636 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026683 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026728 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026756 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.026985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.027017 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkzt\" (UniqueName: \"kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.027036 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.027160 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.027411 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " 
pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.027668 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.031024 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.033690 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.034624 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.035137 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.043843 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkzt\" (UniqueName: \"kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.055050 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.141488 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:48:28 crc kubenswrapper[4848]: I1206 15:48:28.977789 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549a24f1-9102-428e-903b-4f34a1ebb55e" path="/var/lib/kubelet/pods/549a24f1-9102-428e-903b-4f34a1ebb55e/volumes" Dec 06 15:48:29 crc kubenswrapper[4848]: I1206 15:48:29.979942 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.104623 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.105046 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="dnsmasq-dns" containerID="cri-o://d620577155e2eb518cc205aba4690ce802be10bb6fa779aa9ccb985ea7767f34" gracePeriod=10 Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.836307 4848 scope.go:117] "RemoveContainer" 
containerID="2d635a0d3278d2db84fbfec17c292db8743950c32d76bdf4efd9998a4d1922db" Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.931824 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f882q" Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.996957 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts\") pod \"46997064-cc24-406e-8971-0cdbad196707\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.997399 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8qrw\" (UniqueName: \"kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw\") pod \"46997064-cc24-406e-8971-0cdbad196707\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.997431 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data\") pod \"46997064-cc24-406e-8971-0cdbad196707\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.997499 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs\") pod \"46997064-cc24-406e-8971-0cdbad196707\" (UID: \"46997064-cc24-406e-8971-0cdbad196707\") " Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.997607 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle\") pod \"46997064-cc24-406e-8971-0cdbad196707\" (UID: 
\"46997064-cc24-406e-8971-0cdbad196707\") " Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.998108 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs" (OuterVolumeSpecName: "logs") pod "46997064-cc24-406e-8971-0cdbad196707" (UID: "46997064-cc24-406e-8971-0cdbad196707"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:48:30 crc kubenswrapper[4848]: I1206 15:48:30.999829 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46997064-cc24-406e-8971-0cdbad196707-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.003381 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts" (OuterVolumeSpecName: "scripts") pod "46997064-cc24-406e-8971-0cdbad196707" (UID: "46997064-cc24-406e-8971-0cdbad196707"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.013549 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw" (OuterVolumeSpecName: "kube-api-access-x8qrw") pod "46997064-cc24-406e-8971-0cdbad196707" (UID: "46997064-cc24-406e-8971-0cdbad196707"). InnerVolumeSpecName "kube-api-access-x8qrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.031203 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46997064-cc24-406e-8971-0cdbad196707" (UID: "46997064-cc24-406e-8971-0cdbad196707"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.037093 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data" (OuterVolumeSpecName: "config-data") pod "46997064-cc24-406e-8971-0cdbad196707" (UID: "46997064-cc24-406e-8971-0cdbad196707"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.102299 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.102338 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.102348 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8qrw\" (UniqueName: \"kubernetes.io/projected/46997064-cc24-406e-8971-0cdbad196707-kube-api-access-x8qrw\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.102359 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46997064-cc24-406e-8971-0cdbad196707-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.509293 4848 generic.go:334] "Generic (PLEG): container finished" podID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerID="d620577155e2eb518cc205aba4690ce802be10bb6fa779aa9ccb985ea7767f34" exitCode=0 Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.509367 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" 
event={"ID":"47e6554b-9e4d-4d28-bd63-b379825e5396","Type":"ContainerDied","Data":"d620577155e2eb518cc205aba4690ce802be10bb6fa779aa9ccb985ea7767f34"} Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.512439 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f882q" event={"ID":"46997064-cc24-406e-8971-0cdbad196707","Type":"ContainerDied","Data":"f27bf0f28bb1caf112739b385cb8c869fdda954d189f4627d8ea53adad1117c2"} Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.512469 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27bf0f28bb1caf112739b385cb8c869fdda954d189f4627d8ea53adad1117c2" Dec 06 15:48:31 crc kubenswrapper[4848]: I1206 15:48:31.512559 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f882q" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.036836 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b7b97f6b4-d9lkl"] Dec 06 15:48:32 crc kubenswrapper[4848]: E1206 15:48:32.037293 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46997064-cc24-406e-8971-0cdbad196707" containerName="placement-db-sync" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.037310 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="46997064-cc24-406e-8971-0cdbad196707" containerName="placement-db-sync" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.037769 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="46997064-cc24-406e-8971-0cdbad196707" containerName="placement-db-sync" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.039582 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.042664 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.042737 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.042838 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.042955 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.043339 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xx9dj" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.064681 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7b97f6b4-d9lkl"] Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.123569 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fsq\" (UniqueName: \"kubernetes.io/projected/96280da8-11f8-49be-81a6-3bdcd053463f-kube-api-access-27fsq\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.123632 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-scripts\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.123734 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-public-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.123767 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-internal-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.123999 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-combined-ca-bundle\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.124051 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-config-data\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.124088 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96280da8-11f8-49be-81a6-3bdcd053463f-logs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.225606 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-combined-ca-bundle\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226010 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-config-data\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226050 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96280da8-11f8-49be-81a6-3bdcd053463f-logs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226079 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fsq\" (UniqueName: \"kubernetes.io/projected/96280da8-11f8-49be-81a6-3bdcd053463f-kube-api-access-27fsq\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226101 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-scripts\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226126 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-public-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.226149 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-internal-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.227018 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96280da8-11f8-49be-81a6-3bdcd053463f-logs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.229792 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-combined-ca-bundle\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.229806 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-scripts\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.230250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-internal-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: 
\"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.230265 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-public-tls-certs\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.231159 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96280da8-11f8-49be-81a6-3bdcd053463f-config-data\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.261580 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fsq\" (UniqueName: \"kubernetes.io/projected/96280da8-11f8-49be-81a6-3bdcd053463f-kube-api-access-27fsq\") pod \"placement-b7b97f6b4-d9lkl\" (UID: \"96280da8-11f8-49be-81a6-3bdcd053463f\") " pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:32 crc kubenswrapper[4848]: I1206 15:48:32.373473 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:33 crc kubenswrapper[4848]: I1206 15:48:33.463683 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 06 15:48:34 crc kubenswrapper[4848]: I1206 15:48:34.824037 4848 scope.go:117] "RemoveContainer" containerID="9ef2595fe038a309746d865a5f3eabd60934da52fdef527000e3e77079a1bf14" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.062846 4848 scope.go:117] "RemoveContainer" containerID="1c878e5c61b6d4d26b1402cf63219321022c05053e761ed832ca83404b5af29e" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.072285 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.183152 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjhh9\" (UniqueName: \"kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9\") pod \"47e6554b-9e4d-4d28-bd63-b379825e5396\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.183555 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc\") pod \"47e6554b-9e4d-4d28-bd63-b379825e5396\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.183585 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config\") pod \"47e6554b-9e4d-4d28-bd63-b379825e5396\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " Dec 06 
15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.183611 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb\") pod \"47e6554b-9e4d-4d28-bd63-b379825e5396\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.183662 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb\") pod \"47e6554b-9e4d-4d28-bd63-b379825e5396\" (UID: \"47e6554b-9e4d-4d28-bd63-b379825e5396\") " Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.188201 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9" (OuterVolumeSpecName: "kube-api-access-fjhh9") pod "47e6554b-9e4d-4d28-bd63-b379825e5396" (UID: "47e6554b-9e4d-4d28-bd63-b379825e5396"). InnerVolumeSpecName "kube-api-access-fjhh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.232790 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47e6554b-9e4d-4d28-bd63-b379825e5396" (UID: "47e6554b-9e4d-4d28-bd63-b379825e5396"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.232841 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47e6554b-9e4d-4d28-bd63-b379825e5396" (UID: "47e6554b-9e4d-4d28-bd63-b379825e5396"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.236200 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config" (OuterVolumeSpecName: "config") pod "47e6554b-9e4d-4d28-bd63-b379825e5396" (UID: "47e6554b-9e4d-4d28-bd63-b379825e5396"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.252902 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47e6554b-9e4d-4d28-bd63-b379825e5396" (UID: "47e6554b-9e4d-4d28-bd63-b379825e5396"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.285053 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.285086 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjhh9\" (UniqueName: \"kubernetes.io/projected/47e6554b-9e4d-4d28-bd63-b379825e5396-kube-api-access-fjhh9\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.285099 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.285108 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 
15:48:35.285115 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47e6554b-9e4d-4d28-bd63-b379825e5396-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.453953 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.491938 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84fbb4c9b8-bdccl"] Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.548499 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerStarted","Data":"4f411aadb6ae18c2bf0b1ac58bd38e8de597d7adbd747c36805d8765972dd970"} Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.558021 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" event={"ID":"47e6554b-9e4d-4d28-bd63-b379825e5396","Type":"ContainerDied","Data":"487f81c86f593a3f4c50f2bf1a01352c4a88f5a9e3c252ff02d4d5f8cd430fe0"} Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.558082 4848 scope.go:117] "RemoveContainer" containerID="d620577155e2eb518cc205aba4690ce802be10bb6fa779aa9ccb985ea7767f34" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.558158 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lrwq2" Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.564523 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.598964 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.607459 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lrwq2"] Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.614070 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7b97f6b4-d9lkl"] Dec 06 15:48:35 crc kubenswrapper[4848]: W1206 15:48:35.826115 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbceed76_344f_499a_8f86_12bcd30a2936.slice/crio-e37081cd8e0261cba52cfac12f36a0a640aa9b5d1b6e38876b13e3f3d1da98d6 WatchSource:0}: Error finding container e37081cd8e0261cba52cfac12f36a0a640aa9b5d1b6e38876b13e3f3d1da98d6: Status 404 returned error can't find the container with id e37081cd8e0261cba52cfac12f36a0a640aa9b5d1b6e38876b13e3f3d1da98d6 Dec 06 15:48:35 crc kubenswrapper[4848]: W1206 15:48:35.826632 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d987d2_da9e_4cfb_b409_2b5c66f307f8.slice/crio-af112ba74c8bad2f39f98cca2a51c7fbe99db6d46781510a1973c94abc928109 WatchSource:0}: Error finding container af112ba74c8bad2f39f98cca2a51c7fbe99db6d46781510a1973c94abc928109: Status 404 returned error can't find the container with id af112ba74c8bad2f39f98cca2a51c7fbe99db6d46781510a1973c94abc928109 Dec 06 15:48:35 crc kubenswrapper[4848]: I1206 15:48:35.840954 4848 scope.go:117] "RemoveContainer" containerID="adc4dcf0df03ca9a4dac44c771792a69b112d099eaea4de31242817128024969" Dec 06 15:48:36 
crc kubenswrapper[4848]: I1206 15:48:36.569469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84fbb4c9b8-bdccl" event={"ID":"63d987d2-da9e-4cfb-b409-2b5c66f307f8","Type":"ContainerStarted","Data":"80e958407bd8c67c05b141020e8e53aba7b4878a4958b86a8e53ee2b70372d15"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.569827 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84fbb4c9b8-bdccl" event={"ID":"63d987d2-da9e-4cfb-b409-2b5c66f307f8","Type":"ContainerStarted","Data":"af112ba74c8bad2f39f98cca2a51c7fbe99db6d46781510a1973c94abc928109"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.573070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zqt4v" event={"ID":"b4758d52-e17c-484e-96a3-4879daace03e","Type":"ContainerStarted","Data":"1f000778b9ab1df593521956c011d2ce5a5572717d7dfa8c2a84f7bdc0f3e387"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.596500 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerStarted","Data":"bf849cb99bef90fb2f53e1c045acb624718ab668ee73f18e1970562eed5c59fe"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.600638 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zqt4v" podStartSLOduration=2.152279059 podStartE2EDuration="52.600596068s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="2025-12-06 15:47:45.607312384 +0000 UTC m=+1132.905323297" lastFinishedPulling="2025-12-06 15:48:36.055629393 +0000 UTC m=+1183.353640306" observedRunningTime="2025-12-06 15:48:36.592392586 +0000 UTC m=+1183.890403499" watchObservedRunningTime="2025-12-06 15:48:36.600596068 +0000 UTC m=+1183.898606981" Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.608771 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerID="4f411aadb6ae18c2bf0b1ac58bd38e8de597d7adbd747c36805d8765972dd970" exitCode=0 Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.608834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerDied","Data":"4f411aadb6ae18c2bf0b1ac58bd38e8de597d7adbd747c36805d8765972dd970"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.613724 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerStarted","Data":"e37081cd8e0261cba52cfac12f36a0a640aa9b5d1b6e38876b13e3f3d1da98d6"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.626920 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7b97f6b4-d9lkl" event={"ID":"96280da8-11f8-49be-81a6-3bdcd053463f","Type":"ContainerStarted","Data":"bc2146fe1297bf28b855aecfa4c508546f8e9b88c3a1bd0d77d15254491bc3af"} Dec 06 15:48:36 crc kubenswrapper[4848]: I1206 15:48:36.981194 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" path="/var/lib/kubelet/pods/47e6554b-9e4d-4d28-bd63-b379825e5396/volumes" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.636988 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerStarted","Data":"e56f0b5bbe6fd90187a3c4bf4a60a370762d5f91ff34a66306c98d05543a6f5a"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.637371 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerStarted","Data":"b4bcb07f110ef58ee399698e916213a65c6de902e07a82a2db411be5fc9d05b8"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 
15:48:37.639086 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerStarted","Data":"26fd3b3245be8c11ead1dc1a59a18cf3fa6e34c5f6b7ed5999506338d06afbb3"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.651474 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerStarted","Data":"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.661684 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7b97f6b4-d9lkl" event={"ID":"96280da8-11f8-49be-81a6-3bdcd053463f","Type":"ContainerStarted","Data":"2e7caf378ab26fb1a28127c5a855d61a0b65fbb1e45233024398ad52a7a5165b"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.661762 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7b97f6b4-d9lkl" event={"ID":"96280da8-11f8-49be-81a6-3bdcd053463f","Type":"ContainerStarted","Data":"c3c1d4965d53e697386c4cea9ca8229aa2d54d10d1596e1fa731e092a7599185"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.661949 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.662244 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7b97f6b4-d9lkl" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.684850 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6zql" event={"ID":"bb8d9713-c9fb-42c1-8496-03e949d82d8e","Type":"ContainerStarted","Data":"db7fef949e7c57679ee0769869c038c2a6713e7c9061b14379f48c9031fd98bf"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.700825 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerStarted","Data":"75c04ce250a6891ba52d9fea851d2a71862d5bd158931e18887c0fffc4f6e7e6"} Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.700865 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-84fbb4c9b8-bdccl" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.704545 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.704524695 podStartE2EDuration="10.704524695s" podCreationTimestamp="2025-12-06 15:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:37.690617809 +0000 UTC m=+1184.988628732" watchObservedRunningTime="2025-12-06 15:48:37.704524695 +0000 UTC m=+1185.002535608" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.712362 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s6zql" podStartSLOduration=3.10136191 podStartE2EDuration="53.712342697s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="2025-12-06 15:47:45.444127852 +0000 UTC m=+1132.742138765" lastFinishedPulling="2025-12-06 15:48:36.055108639 +0000 UTC m=+1183.353119552" observedRunningTime="2025-12-06 15:48:37.711125094 +0000 UTC m=+1185.009136007" watchObservedRunningTime="2025-12-06 15:48:37.712342697 +0000 UTC m=+1185.010353610" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.739709 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b7b97f6b4-d9lkl" podStartSLOduration=5.739674276 podStartE2EDuration="5.739674276s" podCreationTimestamp="2025-12-06 15:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:37.732811911 
+0000 UTC m=+1185.030822834" watchObservedRunningTime="2025-12-06 15:48:37.739674276 +0000 UTC m=+1185.037685189" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.759679 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-7xkpk" podStartSLOduration=28.111011592 podStartE2EDuration="43.759660477s" podCreationTimestamp="2025-12-06 15:47:54 +0000 UTC" firstStartedPulling="2025-12-06 15:48:19.352814526 +0000 UTC m=+1166.650825440" lastFinishedPulling="2025-12-06 15:48:35.001463412 +0000 UTC m=+1182.299474325" observedRunningTime="2025-12-06 15:48:37.753421218 +0000 UTC m=+1185.051432131" watchObservedRunningTime="2025-12-06 15:48:37.759660477 +0000 UTC m=+1185.057671390" Dec 06 15:48:37 crc kubenswrapper[4848]: I1206 15:48:37.780181 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-84fbb4c9b8-bdccl" podStartSLOduration=10.780163142 podStartE2EDuration="10.780163142s" podCreationTimestamp="2025-12-06 15:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:37.773902053 +0000 UTC m=+1185.071912966" watchObservedRunningTime="2025-12-06 15:48:37.780163142 +0000 UTC m=+1185.078174055" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.142745 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.143802 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.169624 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.186144 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.725571 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerStarted","Data":"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3"} Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.725991 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.726424 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 15:48:38 crc kubenswrapper[4848]: I1206 15:48:38.749499 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.749480757 podStartE2EDuration="14.749480757s" podCreationTimestamp="2025-12-06 15:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:48:38.747015541 +0000 UTC m=+1186.045026454" watchObservedRunningTime="2025-12-06 15:48:38.749480757 +0000 UTC m=+1186.047491670" Dec 06 15:48:44 crc kubenswrapper[4848]: I1206 15:48:44.861951 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:44 crc kubenswrapper[4848]: I1206 15:48:44.862572 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:44 crc kubenswrapper[4848]: I1206 15:48:44.892508 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:44 crc kubenswrapper[4848]: I1206 15:48:44.903885 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Dec 06 15:48:45 crc kubenswrapper[4848]: I1206 15:48:45.790195 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:45 crc kubenswrapper[4848]: I1206 15:48:45.790511 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.807834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerStarted","Data":"9f9e694307b6017a96d838c035f11735b9466417f11e2bf68389e98afe7f1901"} Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.808460 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.808107 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="sg-core" containerID="cri-o://75c04ce250a6891ba52d9fea851d2a71862d5bd158931e18887c0fffc4f6e7e6" gracePeriod=30 Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.808021 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-central-agent" containerID="cri-o://ce6f1a1589c157e1c88cafc71550e049b87f6286b07e122a2d2ad9037d6fc382" gracePeriod=30 Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.808118 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-notification-agent" containerID="cri-o://b6c00fb3993383a7f44505922d76417b105578e1867407554d7f2d89e992e320" gracePeriod=30 Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.808125 4848 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="proxy-httpd" containerID="cri-o://9f9e694307b6017a96d838c035f11735b9466417f11e2bf68389e98afe7f1901" gracePeriod=30
Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.842214 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.842291 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.846608 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.034860091 podStartE2EDuration="1m3.846587337s" podCreationTimestamp="2025-12-06 15:47:44 +0000 UTC" firstStartedPulling="2025-12-06 15:47:45.443713171 +0000 UTC m=+1132.741724084" lastFinishedPulling="2025-12-06 15:48:46.255440427 +0000 UTC m=+1193.553451330" observedRunningTime="2025-12-06 15:48:47.834730906 +0000 UTC m=+1195.132741839" watchObservedRunningTime="2025-12-06 15:48:47.846587337 +0000 UTC m=+1195.144598260"
Dec 06 15:48:47 crc kubenswrapper[4848]: I1206 15:48:47.855113 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 06 15:48:48 crc kubenswrapper[4848]: I1206 15:48:48.818868 4848 generic.go:334] "Generic (PLEG): container finished" podID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerID="75c04ce250a6891ba52d9fea851d2a71862d5bd158931e18887c0fffc4f6e7e6" exitCode=2
Dec 06 15:48:48 crc kubenswrapper[4848]: I1206 15:48:48.819077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerDied","Data":"75c04ce250a6891ba52d9fea851d2a71862d5bd158931e18887c0fffc4f6e7e6"}
Dec 06 15:48:49 crc kubenswrapper[4848]: I1206 15:48:49.843151 4848 generic.go:334] "Generic (PLEG): container finished" podID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerID="9f9e694307b6017a96d838c035f11735b9466417f11e2bf68389e98afe7f1901" exitCode=0
Dec 06 15:48:49 crc kubenswrapper[4848]: I1206 15:48:49.843210 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerDied","Data":"9f9e694307b6017a96d838c035f11735b9466417f11e2bf68389e98afe7f1901"}
Dec 06 15:48:50 crc kubenswrapper[4848]: I1206 15:48:50.853949 4848 generic.go:334] "Generic (PLEG): container finished" podID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerID="b6c00fb3993383a7f44505922d76417b105578e1867407554d7f2d89e992e320" exitCode=0
Dec 06 15:48:50 crc kubenswrapper[4848]: I1206 15:48:50.853977 4848 generic.go:334] "Generic (PLEG): container finished" podID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerID="ce6f1a1589c157e1c88cafc71550e049b87f6286b07e122a2d2ad9037d6fc382" exitCode=0
Dec 06 15:48:50 crc kubenswrapper[4848]: I1206 15:48:50.854000 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerDied","Data":"b6c00fb3993383a7f44505922d76417b105578e1867407554d7f2d89e992e320"}
Dec 06 15:48:50 crc kubenswrapper[4848]: I1206 15:48:50.854026 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerDied","Data":"ce6f1a1589c157e1c88cafc71550e049b87f6286b07e122a2d2ad9037d6fc382"}
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.266034 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380115 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380203 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380389 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380438 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380484 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq9xt\" (UniqueName: \"kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380526 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.380577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle\") pod \"a05e5db4-33a2-403d-b7ba-8e70207374ae\" (UID: \"a05e5db4-33a2-403d-b7ba-8e70207374ae\") "
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.381385 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.381606 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.385936 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts" (OuterVolumeSpecName: "scripts") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.386756 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt" (OuterVolumeSpecName: "kube-api-access-wq9xt") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "kube-api-access-wq9xt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.408813 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.448320 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.469094 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data" (OuterVolumeSpecName: "config-data") pod "a05e5db4-33a2-403d-b7ba-8e70207374ae" (UID: "a05e5db4-33a2-403d-b7ba-8e70207374ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482523 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482558 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482567 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482576 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05e5db4-33a2-403d-b7ba-8e70207374ae-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482585 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482595 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq9xt\" (UniqueName: \"kubernetes.io/projected/a05e5db4-33a2-403d-b7ba-8e70207374ae-kube-api-access-wq9xt\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.482605 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a05e5db4-33a2-403d-b7ba-8e70207374ae-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.864505 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a05e5db4-33a2-403d-b7ba-8e70207374ae","Type":"ContainerDied","Data":"6f5502164c2e7bc140eaa1ca782f5af324b8e8a81bf68272b852cf737ac8f1d1"}
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.864851 4848 scope.go:117] "RemoveContainer" containerID="9f9e694307b6017a96d838c035f11735b9466417f11e2bf68389e98afe7f1901"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.864560 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.887018 4848 scope.go:117] "RemoveContainer" containerID="75c04ce250a6891ba52d9fea851d2a71862d5bd158931e18887c0fffc4f6e7e6"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.901906 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.914597 4848 scope.go:117] "RemoveContainer" containerID="b6c00fb3993383a7f44505922d76417b105578e1867407554d7f2d89e992e320"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.920026 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.936462 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.937184 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="dnsmasq-dns"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.937343 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="dnsmasq-dns"
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.937433 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="init"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.937512 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="init"
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.937598 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="sg-core"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.937671 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="sg-core"
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.937782 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-central-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.937871 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-central-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.937959 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-notification-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.938029 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-notification-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: E1206 15:48:51.938102 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="proxy-httpd"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.938163 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="proxy-httpd"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.938419 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-central-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.940211 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="proxy-httpd"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.940335 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="ceilometer-notification-agent"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.940431 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" containerName="sg-core"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.940515 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6554b-9e4d-4d28-bd63-b379825e5396" containerName="dnsmasq-dns"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.940172 4848 scope.go:117] "RemoveContainer" containerID="ce6f1a1589c157e1c88cafc71550e049b87f6286b07e122a2d2ad9037d6fc382"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.944007 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.944433 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.946290 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.948831 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.989804 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pms\" (UniqueName: \"kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.989856 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.989945 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.990081 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.990162 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.990200 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:51 crc kubenswrapper[4848]: I1206 15:48:51.990231 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091611 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091689 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091757 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091779 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pms\" (UniqueName: \"kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091803 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091883 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.091918 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.092421 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.092772 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.096168 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.096203 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.096441 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.097082 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.108982 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pms\" (UniqueName: \"kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms\") pod \"ceilometer-0\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.269857 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.817979 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 15:48:52 crc kubenswrapper[4848]: W1206 15:48:52.818614 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0e76b7_c169_4b04_869f_d47bac964878.slice/crio-8a1fc0b35311ea583ce41393cbabb832aeb40729b628f9af8b09191fc5b11e9a WatchSource:0}: Error finding container 8a1fc0b35311ea583ce41393cbabb832aeb40729b628f9af8b09191fc5b11e9a: Status 404 returned error can't find the container with id 8a1fc0b35311ea583ce41393cbabb832aeb40729b628f9af8b09191fc5b11e9a
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.876025 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerStarted","Data":"8a1fc0b35311ea583ce41393cbabb832aeb40729b628f9af8b09191fc5b11e9a"}
Dec 06 15:48:52 crc kubenswrapper[4848]: I1206 15:48:52.978308 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05e5db4-33a2-403d-b7ba-8e70207374ae" path="/var/lib/kubelet/pods/a05e5db4-33a2-403d-b7ba-8e70207374ae/volumes"
Dec 06 15:48:54 crc kubenswrapper[4848]: I1206 15:48:54.893189 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerStarted","Data":"2b55866da83362d77e35d1af6087ae24fa43dffe93c928496a31609e6d7ea2c0"}
Dec 06 15:48:54 crc kubenswrapper[4848]: I1206 15:48:54.893718 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerStarted","Data":"35b876245f7801f0343d52d8c0a744ab5322f9a08723d470b56fb0e4bc2abe09"}
Dec 06 15:48:55 crc kubenswrapper[4848]: I1206 15:48:55.903816 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerStarted","Data":"16d1f5871c65e7ca1763c3fabbd9b952396702c1ff6a0c2b6e7a158f0b7bcdfb"}
Dec 06 15:48:56 crc kubenswrapper[4848]: I1206 15:48:56.914400 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerStarted","Data":"4a29ade8abaf4b55fa3b0a45279dc04fdd4b1d3cda18421fc52a4073da2bd29e"}
Dec 06 15:48:56 crc kubenswrapper[4848]: I1206 15:48:56.916212 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 06 15:48:56 crc kubenswrapper[4848]: I1206 15:48:56.947889 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.312155671 podStartE2EDuration="5.947864218s" podCreationTimestamp="2025-12-06 15:48:51 +0000 UTC" firstStartedPulling="2025-12-06 15:48:52.820763306 +0000 UTC m=+1200.118774219" lastFinishedPulling="2025-12-06 15:48:56.456471853 +0000 UTC m=+1203.754482766" observedRunningTime="2025-12-06 15:48:56.939637956 +0000 UTC m=+1204.237648869" watchObservedRunningTime="2025-12-06 15:48:56.947864218 +0000 UTC m=+1204.245875131"
Dec 06 15:48:57 crc kubenswrapper[4848]: I1206 15:48:57.922652 4848 generic.go:334] "Generic (PLEG): container finished" podID="b4758d52-e17c-484e-96a3-4879daace03e" containerID="1f000778b9ab1df593521956c011d2ce5a5572717d7dfa8c2a84f7bdc0f3e387" exitCode=0
Dec 06 15:48:57 crc kubenswrapper[4848]: I1206 15:48:57.922738 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zqt4v" event={"ID":"b4758d52-e17c-484e-96a3-4879daace03e","Type":"ContainerDied","Data":"1f000778b9ab1df593521956c011d2ce5a5572717d7dfa8c2a84f7bdc0f3e387"}
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.272233 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zqt4v"
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.321028 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xbd6\" (UniqueName: \"kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6\") pod \"b4758d52-e17c-484e-96a3-4879daace03e\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") "
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.321263 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle\") pod \"b4758d52-e17c-484e-96a3-4879daace03e\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") "
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.321299 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data\") pod \"b4758d52-e17c-484e-96a3-4879daace03e\" (UID: \"b4758d52-e17c-484e-96a3-4879daace03e\") "
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.327777 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6" (OuterVolumeSpecName: "kube-api-access-4xbd6") pod "b4758d52-e17c-484e-96a3-4879daace03e" (UID: "b4758d52-e17c-484e-96a3-4879daace03e"). InnerVolumeSpecName "kube-api-access-4xbd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.334828 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b4758d52-e17c-484e-96a3-4879daace03e" (UID: "b4758d52-e17c-484e-96a3-4879daace03e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.355319 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4758d52-e17c-484e-96a3-4879daace03e" (UID: "b4758d52-e17c-484e-96a3-4879daace03e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.423507 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xbd6\" (UniqueName: \"kubernetes.io/projected/b4758d52-e17c-484e-96a3-4879daace03e-kube-api-access-4xbd6\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.424013 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.424122 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4758d52-e17c-484e-96a3-4879daace03e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.545924 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-84fbb4c9b8-bdccl"
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.939834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zqt4v" event={"ID":"b4758d52-e17c-484e-96a3-4879daace03e","Type":"ContainerDied","Data":"996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59"}
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.939876 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="996b6d7660ee46bb3609b4d3bae0fa3aeb44d8786132ec3f963359ba1b28fe59"
Dec 06 15:48:59 crc kubenswrapper[4848]: I1206 15:48:59.939937 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zqt4v"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.150107 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69c9c94d7-cmv75"]
Dec 06 15:49:00 crc kubenswrapper[4848]: E1206 15:49:00.150464 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4758d52-e17c-484e-96a3-4879daace03e" containerName="barbican-db-sync"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.150480 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4758d52-e17c-484e-96a3-4879daace03e" containerName="barbican-db-sync"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.150721 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4758d52-e17c-484e-96a3-4879daace03e" containerName="barbican-db-sync"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.159353 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.161854 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.172056 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.176785 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ffck"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.178853 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69c9c94d7-cmv75"]
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.217555 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"]
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.219222 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.223853 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.237566 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.237625 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66ac417-22af-4413-afdd-d3b8006a5eb8-logs\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.237647 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-combined-ca-bundle\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.237766 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgptf\" (UniqueName: \"kubernetes.io/projected/e66ac417-22af-4413-afdd-d3b8006a5eb8-kube-api-access-cgptf\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.237823 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data-custom\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.331100 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"]
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.343988 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344063 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data-custom\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344093 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxvt\" (UniqueName: \"kubernetes.io/projected/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-kube-api-access-qmxvt\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344124 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data-custom\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344235 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66ac417-22af-4413-afdd-d3b8006a5eb8-logs\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344262 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-combined-ca-bundle\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344297 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-logs\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"
Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344353 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-combined-ca-bundle\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.344385 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgptf\" (UniqueName: \"kubernetes.io/projected/e66ac417-22af-4413-afdd-d3b8006a5eb8-kube-api-access-cgptf\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.355335 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.355946 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66ac417-22af-4413-afdd-d3b8006a5eb8-logs\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.356658 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.358243 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.360976 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.363478 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-config-data-custom\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.370963 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66ac417-22af-4413-afdd-d3b8006a5eb8-combined-ca-bundle\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.379506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgptf\" (UniqueName: \"kubernetes.io/projected/e66ac417-22af-4413-afdd-d3b8006a5eb8-kube-api-access-cgptf\") pod \"barbican-worker-69c9c94d7-cmv75\" (UID: \"e66ac417-22af-4413-afdd-d3b8006a5eb8\") " pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447618 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " 
pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447667 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447711 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-logs\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447735 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447763 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447791 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-combined-ca-bundle\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: 
\"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447825 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8v8p\" (UniqueName: \"kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447874 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447894 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxvt\" (UniqueName: \"kubernetes.io/projected/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-kube-api-access-qmxvt\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447920 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data-custom\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.447959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.448330 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-logs\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.454244 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-combined-ca-bundle\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.457307 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.459430 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-config-data-custom\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.479250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxvt\" 
(UniqueName: \"kubernetes.io/projected/318d0309-cf5f-4bfe-8c93-c72f13ce4a24-kube-api-access-qmxvt\") pod \"barbican-keystone-listener-84d7d7d8f8-jgnhx\" (UID: \"318d0309-cf5f-4bfe-8c93-c72f13ce4a24\") " pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.481764 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.483736 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.485906 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.488405 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69c9c94d7-cmv75" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.493933 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550039 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550106 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550158 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msldt\" (UniqueName: \"kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550185 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550246 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v8p\" (UniqueName: \"kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550345 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550374 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 
15:49:00.550396 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550455 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550486 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.550518 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.557419 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.558227 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.558961 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.559967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.573971 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.579831 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.597024 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v8p\" (UniqueName: \"kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p\") pod \"dnsmasq-dns-6d66f584d7-kp76h\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.652086 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msldt\" (UniqueName: \"kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.652134 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.652294 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.652321 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " 
pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.652373 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.653112 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.658939 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.660812 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.663819 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: 
I1206 15:49:00.675162 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msldt\" (UniqueName: \"kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt\") pod \"barbican-api-6cdcd56486-4mb97\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.756127 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.760104 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:00 crc kubenswrapper[4848]: I1206 15:49:00.911260 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:01 crc kubenswrapper[4848]: I1206 15:49:01.008986 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.170831 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx"] Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.262888 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69c9c94d7-cmv75"] Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.367049 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.412607 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.568101 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.572391 4848 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.575355 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nkfvj" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.575537 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.575628 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.627515 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.627823 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnv6\" (UniqueName: \"kubernetes.io/projected/fbd6447e-5669-4705-948c-fb6b4083f67c-kube-api-access-wpnv6\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.627939 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient" Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.628044 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.629903 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.733676 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnv6\" (UniqueName: \"kubernetes.io/projected/fbd6447e-5669-4705-948c-fb6b4083f67c-kube-api-access-wpnv6\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.733750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.733780 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.733818 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.736655 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: E1206 15:49:02.745504 4848 projected.go:194] Error preparing data for projected volume kube-api-access-wpnv6 for pod openstack/openstackclient: failed to fetch token: pods "openstackclient" not found
Dec 06 15:49:02 crc kubenswrapper[4848]: E1206 15:49:02.745583 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6447e-5669-4705-948c-fb6b4083f67c-kube-api-access-wpnv6 podName:fbd6447e-5669-4705-948c-fb6b4083f67c nodeName:}" failed. No retries permitted until 2025-12-06 15:49:03.24556195 +0000 UTC m=+1210.543572863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wpnv6" (UniqueName: "kubernetes.io/projected/fbd6447e-5669-4705-948c-fb6b4083f67c-kube-api-access-wpnv6") pod "openstackclient" (UID: "fbd6447e-5669-4705-948c-fb6b4083f67c") : failed to fetch token: pods "openstackclient" not found
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.756785 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:02 crc kubenswrapper[4848]: E1206 15:49:02.758807 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-wpnv6 openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="fbd6447e-5669-4705-948c-fb6b4083f67c"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.765502 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.766621 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.769145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret\") pod \"openstackclient\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.816810 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.818060 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.824581 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.936721 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.936763 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config-secret\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.936869 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.936900 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spxsd\" (UniqueName: \"kubernetes.io/projected/28b28ed8-c6be-4256-8ccd-8c560959048b-kube-api-access-spxsd\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.983037 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerStarted","Data":"ca978b1092cf0b8ab8bc5fb6a3ddb8ca89c85d05ff505058d78e15218334e4de"}
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.983070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerStarted","Data":"3d931f5c9c9a5eea838edccf4df2095a0e3ce4ab2391a3787d5bf30fafeb10bf"}
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.983080 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69c9c94d7-cmv75" event={"ID":"e66ac417-22af-4413-afdd-d3b8006a5eb8","Type":"ContainerStarted","Data":"97ef3ad435fbba15d4ca9df6fb4d474b3df2a869fa6f33b1478db0b78ecfd341"}
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.987509 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" event={"ID":"318d0309-cf5f-4bfe-8c93-c72f13ce4a24","Type":"ContainerStarted","Data":"c03905dc8d7317482fd1b3fa02f60d5291e3f25b1bd896a2d844a110e0cdfb79"}
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.995199 4848 generic.go:334] "Generic (PLEG): container finished" podID="7773ec39-baea-46cd-bd39-520ba343805d" containerID="8a2a638de6128b516c16d440911fd567c590742ae834c9213eb90b297bfc2179" exitCode=0
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.995258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.995848 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" event={"ID":"7773ec39-baea-46cd-bd39-520ba343805d","Type":"ContainerDied","Data":"8a2a638de6128b516c16d440911fd567c590742ae834c9213eb90b297bfc2179"}
Dec 06 15:49:02 crc kubenswrapper[4848]: I1206 15:49:02.995872 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" event={"ID":"7773ec39-baea-46cd-bd39-520ba343805d","Type":"ContainerStarted","Data":"bf7d408da9660c368f5d99a4561e58c703c5c746c0cddf278dd1906eb236e34b"}
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.005885 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.038839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.039556 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spxsd\" (UniqueName: \"kubernetes.io/projected/28b28ed8-c6be-4256-8ccd-8c560959048b-kube-api-access-spxsd\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.039767 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.039878 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config-secret\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.041155 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.048865 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.050692 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28b28ed8-c6be-4256-8ccd-8c560959048b-openstack-config-secret\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.055589 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fbd6447e-5669-4705-948c-fb6b4083f67c" podUID="28b28ed8-c6be-4256-8ccd-8c560959048b"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.072558 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxsd\" (UniqueName: \"kubernetes.io/projected/28b28ed8-c6be-4256-8ccd-8c560959048b-kube-api-access-spxsd\") pod \"openstackclient\" (UID: \"28b28ed8-c6be-4256-8ccd-8c560959048b\") " pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.141543 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret\") pod \"fbd6447e-5669-4705-948c-fb6b4083f67c\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") "
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.141911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle\") pod \"fbd6447e-5669-4705-948c-fb6b4083f67c\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") "
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.142169 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config\") pod \"fbd6447e-5669-4705-948c-fb6b4083f67c\" (UID: \"fbd6447e-5669-4705-948c-fb6b4083f67c\") "
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.143598 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpnv6\" (UniqueName: \"kubernetes.io/projected/fbd6447e-5669-4705-948c-fb6b4083f67c-kube-api-access-wpnv6\") on node \"crc\" DevicePath \"\""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.144478 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.144682 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fbd6447e-5669-4705-948c-fb6b4083f67c" (UID: "fbd6447e-5669-4705-948c-fb6b4083f67c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.145233 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fbd6447e-5669-4705-948c-fb6b4083f67c" (UID: "fbd6447e-5669-4705-948c-fb6b4083f67c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.145339 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbd6447e-5669-4705-948c-fb6b4083f67c" (UID: "fbd6447e-5669-4705-948c-fb6b4083f67c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.183343 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5668cb4c58-xxwrf"]
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.184745 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.193057 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.193143 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.208747 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5668cb4c58-xxwrf"]
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.246138 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.246180 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd6447e-5669-4705-948c-fb6b4083f67c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.246190 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbd6447e-5669-4705-948c-fb6b4083f67c-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.350818 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-logs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351192 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4jm\" (UniqueName: \"kubernetes.io/projected/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-kube-api-access-cz4jm\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351219 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351256 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data-custom\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-combined-ca-bundle\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351302 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-internal-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.351322 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-public-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-public-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460761 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-logs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460822 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4jm\" (UniqueName: \"kubernetes.io/projected/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-kube-api-access-cz4jm\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460841 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460877 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data-custom\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460909 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-combined-ca-bundle\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.460932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-internal-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.471210 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-logs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.471277 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data-custom\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.475547 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-public-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.480627 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-combined-ca-bundle\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.480633 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4jm\" (UniqueName: \"kubernetes.io/projected/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-kube-api-access-cz4jm\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.481005 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-internal-tls-certs\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.513948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae8c267-b7dd-4336-bedb-11c1e1bae7c3-config-data\") pod \"barbican-api-5668cb4c58-xxwrf\" (UID: \"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3\") " pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.539273 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:03 crc kubenswrapper[4848]: I1206 15:49:03.829968 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.017960 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerStarted","Data":"99e513c8ef9c3d15aa01e09162ff156ad51d7b6738fe58c478d253d9c27ea452"}
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.018909 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cdcd56486-4mb97"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.019099 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cdcd56486-4mb97"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.021058 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.021113 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" event={"ID":"7773ec39-baea-46cd-bd39-520ba343805d","Type":"ContainerStarted","Data":"b1b36c9f28117d863bdf25da88665a65d2890fc8af18ab559c55fc4c67f33b56"}
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.021192 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.036730 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cdcd56486-4mb97" podStartSLOduration=4.036711023 podStartE2EDuration="4.036711023s" podCreationTimestamp="2025-12-06 15:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:04.033221978 +0000 UTC m=+1211.331232891" watchObservedRunningTime="2025-12-06 15:49:04.036711023 +0000 UTC m=+1211.334721936"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.056818 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fbd6447e-5669-4705-948c-fb6b4083f67c" podUID="28b28ed8-c6be-4256-8ccd-8c560959048b"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.138729 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" podStartSLOduration=4.138713692 podStartE2EDuration="4.138713692s" podCreationTimestamp="2025-12-06 15:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:04.05399126 +0000 UTC m=+1211.352002183" watchObservedRunningTime="2025-12-06 15:49:04.138713692 +0000 UTC m=+1211.436724605"
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.141768 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5668cb4c58-xxwrf"]
Dec 06 15:49:04 crc kubenswrapper[4848]: W1206 15:49:04.894866 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b28ed8_c6be_4256_8ccd_8c560959048b.slice/crio-249cd4ec9d014537e3570ebce0b19e98450b95f98af4453bb266253fb0de5d9b WatchSource:0}: Error finding container 249cd4ec9d014537e3570ebce0b19e98450b95f98af4453bb266253fb0de5d9b: Status 404 returned error can't find the container with id 249cd4ec9d014537e3570ebce0b19e98450b95f98af4453bb266253fb0de5d9b
Dec 06 15:49:04 crc kubenswrapper[4848]: I1206 15:49:04.977258 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd6447e-5669-4705-948c-fb6b4083f67c" path="/var/lib/kubelet/pods/fbd6447e-5669-4705-948c-fb6b4083f67c/volumes"
Dec 06 15:49:05 crc kubenswrapper[4848]: I1206 15:49:05.051098 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"28b28ed8-c6be-4256-8ccd-8c560959048b","Type":"ContainerStarted","Data":"249cd4ec9d014537e3570ebce0b19e98450b95f98af4453bb266253fb0de5d9b"}
Dec 06 15:49:05 crc kubenswrapper[4848]: I1206 15:49:05.052890 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5668cb4c58-xxwrf" event={"ID":"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3","Type":"ContainerStarted","Data":"209ac76f12d7be8f9fae6e3f14608c54b556405e83d76d97da87cab03e485ebd"}
Dec 06 15:49:05 crc kubenswrapper[4848]: I1206 15:49:05.781131 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7b97f6b4-d9lkl"
Dec 06 15:49:05 crc kubenswrapper[4848]: I1206 15:49:05.781801 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7b97f6b4-d9lkl"
Dec 06 15:49:07 crc kubenswrapper[4848]: I1206 15:49:07.074935 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5668cb4c58-xxwrf" event={"ID":"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3","Type":"ContainerStarted","Data":"77405abf305cada21df22277a7d57a2942180f5e0af72a1fab4c0c25cd8a292f"}
Dec 06 15:49:07 crc kubenswrapper[4848]: I1206 15:49:07.080650 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69c9c94d7-cmv75" event={"ID":"e66ac417-22af-4413-afdd-d3b8006a5eb8","Type":"ContainerStarted","Data":"9a939a7ae58a6511770eb623af29b380d89be5db69cef21ab6c3a7af5560252e"}
Dec 06 15:49:07 crc kubenswrapper[4848]: I1206 15:49:07.081624 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" event={"ID":"318d0309-cf5f-4bfe-8c93-c72f13ce4a24","Type":"ContainerStarted","Data":"9cfca901ded7e5cf39b4eabe7102ed3f35d320e6f849a9a033295b28744c59b4"}
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.094650 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5668cb4c58-xxwrf" event={"ID":"3ae8c267-b7dd-4336-bedb-11c1e1bae7c3","Type":"ContainerStarted","Data":"b4403b36f1382ef86edc756a035e3947f4f7f711f6cf8a7b1ed7f526abea3c95"}
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.095079 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.095098 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5668cb4c58-xxwrf"
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.098733 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69c9c94d7-cmv75" event={"ID":"e66ac417-22af-4413-afdd-d3b8006a5eb8","Type":"ContainerStarted","Data":"dabe16ea9d1eb107aeb8045b61a8afac44b87c6f9f6bdfc5ee5c29e31f49c78d"}
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.102544 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" event={"ID":"318d0309-cf5f-4bfe-8c93-c72f13ce4a24","Type":"ContainerStarted","Data":"4f116d2bfc26d6c19c1ed0d57ac883525c14e08aea12542de82a9e3b5cdda0d7"}
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.120415 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5668cb4c58-xxwrf" podStartSLOduration=5.120393769 podStartE2EDuration="5.120393769s" podCreationTimestamp="2025-12-06 15:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:08.114408777 +0000 UTC m=+1215.412419690" watchObservedRunningTime="2025-12-06 15:49:08.120393769 +0000 UTC m=+1215.418404672"
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.139373 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-84d7d7d8f8-jgnhx" podStartSLOduration=4.263547899 podStartE2EDuration="8.139346992s" podCreationTimestamp="2025-12-06 15:49:00 +0000 UTC" firstStartedPulling="2025-12-06 15:49:02.174592751 +0000 UTC m=+1209.472603674" lastFinishedPulling="2025-12-06 15:49:06.050391854 +0000 UTC m=+1213.348402767" observedRunningTime="2025-12-06 15:49:08.135933129 +0000 UTC m=+1215.433944042" watchObservedRunningTime="2025-12-06 15:49:08.139346992 +0000 UTC m=+1215.437357905"
Dec 06 15:49:08 crc kubenswrapper[4848]: I1206 15:49:08.175459 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69c9c94d7-cmv75" podStartSLOduration=4.412962873 podStartE2EDuration="8.175424149s" podCreationTimestamp="2025-12-06 15:49:00 +0000 UTC" firstStartedPulling="2025-12-06 15:49:02.286862399 +0000 UTC m=+1209.584873312" lastFinishedPulling="2025-12-06 15:49:06.049323675 +0000 UTC m=+1213.347334588" observedRunningTime="2025-12-06 15:49:08.153496635 +0000 UTC m=+1215.451507548" watchObservedRunningTime="2025-12-06 15:49:08.175424149 +0000 UTC m=+1215.473435062"
Dec 06 15:49:09 crc kubenswrapper[4848]: I1206 15:49:09.113890 4848 generic.go:334] "Generic (PLEG): container finished" podID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" containerID="db7fef949e7c57679ee0769869c038c2a6713e7c9061b14379f48c9031fd98bf" exitCode=0
Dec 06 15:49:09 crc kubenswrapper[4848]: I1206 15:49:09.114108 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s6zql" event={"ID":"bb8d9713-c9fb-42c1-8496-03e949d82d8e","Type":"ContainerDied","Data":"db7fef949e7c57679ee0769869c038c2a6713e7c9061b14379f48c9031fd98bf"}
Dec 06 15:49:10 crc kubenswrapper[4848]: I1206 15:49:10.763351 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h"
Dec 06 15:49:10 crc kubenswrapper[4848]: I1206 15:49:10.824926 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"]
Dec 06 15:49:10 crc kubenswrapper[4848]: I1206 15:49:10.825182 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="dnsmasq-dns" containerID="cri-o://01b266830cc511b134b207bbce4248b0ae82af6c1341804ffb120ca6b1fed4a2" gracePeriod=10
Dec 06 15:49:11 crc kubenswrapper[4848]: I1206 15:49:11.148087 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerID="01b266830cc511b134b207bbce4248b0ae82af6c1341804ffb120ca6b1fed4a2" exitCode=0
Dec 06 15:49:11 crc kubenswrapper[4848]: I1206 15:49:11.148250 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" event={"ID":"0a1000e7-e5bf-483a-aaa5-d79003725c6d","Type":"ContainerDied","Data":"01b266830cc511b134b207bbce4248b0ae82af6c1341804ffb120ca6b1fed4a2"}
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.093087 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.096538 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cdcd56486-4mb97"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.096819 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.103931 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.167013 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cdcd56486-4mb97"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.170864 4848 generic.go:334] "Generic (PLEG): container finished" podID="dc0186b0-9bb6-401b-bec2-80ee1058b4e8" containerID="1d28c15762b6288c63e38c8b80841f096bf6cb44b59512f1451441ff95905739" exitCode=0
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.170929 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rfkvg" event={"ID":"dc0186b0-9bb6-401b-bec2-80ee1058b4e8","Type":"ContainerDied","Data":"1d28c15762b6288c63e38c8b80841f096bf6cb44b59512f1451441ff95905739"}
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.956194 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76858ffddc-pvnks"]
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.958226 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76858ffddc-pvnks"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.962163 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.962173 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.962221 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 06 15:49:13 crc kubenswrapper[4848]: I1206 15:49:13.969483 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76858ffddc-pvnks"]
Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069271 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-internal-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks"
Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069377 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjfj\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-kube-api-access-7cjfj\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks"
Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069487 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-etc-swift\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") "
pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069518 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-run-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069546 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-log-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.069961 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-config-data\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.070057 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-public-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.070120 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-combined-ca-bundle\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: 
\"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172703 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-config-data\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172771 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-public-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172808 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-combined-ca-bundle\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172849 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-internal-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172882 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjfj\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-kube-api-access-7cjfj\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: 
\"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172931 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-etc-swift\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172952 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-run-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.172978 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-log-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.173497 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-log-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.174046 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c86a3f4-dccd-48e8-9169-a63eaaded209-run-httpd\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc 
kubenswrapper[4848]: I1206 15:49:14.182556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-combined-ca-bundle\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.182620 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-etc-swift\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.183384 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-public-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.188031 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-config-data\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.191788 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c86a3f4-dccd-48e8-9169-a63eaaded209-internal-tls-certs\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.228169 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7cjfj\" (UniqueName: \"kubernetes.io/projected/6c86a3f4-dccd-48e8-9169-a63eaaded209-kube-api-access-7cjfj\") pod \"swift-proxy-76858ffddc-pvnks\" (UID: \"6c86a3f4-dccd-48e8-9169-a63eaaded209\") " pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.291176 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.726960 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.727583 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-central-agent" containerID="cri-o://35b876245f7801f0343d52d8c0a744ab5322f9a08723d470b56fb0e4bc2abe09" gracePeriod=30 Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.727730 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="sg-core" containerID="cri-o://16d1f5871c65e7ca1763c3fabbd9b952396702c1ff6a0c2b6e7a158f0b7bcdfb" gracePeriod=30 Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.727735 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="proxy-httpd" containerID="cri-o://4a29ade8abaf4b55fa3b0a45279dc04fdd4b1d3cda18421fc52a4073da2bd29e" gracePeriod=30 Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.727786 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-notification-agent" containerID="cri-o://2b55866da83362d77e35d1af6087ae24fa43dffe93c928496a31609e6d7ea2c0" 
gracePeriod=30 Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.763757 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.150:3000/\": EOF" Dec 06 15:49:14 crc kubenswrapper[4848]: I1206 15:49:14.976521 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Dec 06 15:49:15 crc kubenswrapper[4848]: I1206 15:49:15.404175 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5668cb4c58-xxwrf" Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204084 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe0e76b7-c169-4b04-869f-d47bac964878" containerID="4a29ade8abaf4b55fa3b0a45279dc04fdd4b1d3cda18421fc52a4073da2bd29e" exitCode=0 Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204306 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe0e76b7-c169-4b04-869f-d47bac964878" containerID="16d1f5871c65e7ca1763c3fabbd9b952396702c1ff6a0c2b6e7a158f0b7bcdfb" exitCode=2 Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204314 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe0e76b7-c169-4b04-869f-d47bac964878" containerID="35b876245f7801f0343d52d8c0a744ab5322f9a08723d470b56fb0e4bc2abe09" exitCode=0 Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204333 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerDied","Data":"4a29ade8abaf4b55fa3b0a45279dc04fdd4b1d3cda18421fc52a4073da2bd29e"} Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204356 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerDied","Data":"16d1f5871c65e7ca1763c3fabbd9b952396702c1ff6a0c2b6e7a158f0b7bcdfb"} Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.204365 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerDied","Data":"35b876245f7801f0343d52d8c0a744ab5322f9a08723d470b56fb0e4bc2abe09"} Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.519293 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5668cb4c58-xxwrf" Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.583133 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.583487 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" containerID="cri-o://ca978b1092cf0b8ab8bc5fb6a3ddb8ca89c85d05ff505058d78e15218334e4de" gracePeriod=30 Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.583793 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" containerID="cri-o://99e513c8ef9c3d15aa01e09162ff156ad51d7b6738fe58c478d253d9c27ea452" gracePeriod=30 Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.610007 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.610340 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 06 15:49:16 crc kubenswrapper[4848]: I1206 15:49:16.634174 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 06 15:49:17 crc kubenswrapper[4848]: I1206 15:49:17.215233 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe0e76b7-c169-4b04-869f-d47bac964878" containerID="2b55866da83362d77e35d1af6087ae24fa43dffe93c928496a31609e6d7ea2c0" exitCode=0 Dec 06 15:49:17 crc kubenswrapper[4848]: I1206 15:49:17.215272 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerDied","Data":"2b55866da83362d77e35d1af6087ae24fa43dffe93c928496a31609e6d7ea2c0"} Dec 06 15:49:18 crc kubenswrapper[4848]: I1206 15:49:18.225445 4848 generic.go:334] "Generic (PLEG): container finished" podID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerID="ca978b1092cf0b8ab8bc5fb6a3ddb8ca89c85d05ff505058d78e15218334e4de" exitCode=143 Dec 06 15:49:18 crc kubenswrapper[4848]: I1206 15:49:18.225524 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerDied","Data":"ca978b1092cf0b8ab8bc5fb6a3ddb8ca89c85d05ff505058d78e15218334e4de"} Dec 06 15:49:19 crc kubenswrapper[4848]: E1206 15:49:19.404257 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 06 15:49:19 crc kubenswrapper[4848]: E1206 15:49:19.404392 4848 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bdh5c8h5fbh68bh594h669hcdh698hb5h64h5fdh69h657h555h88h58fh589h67dhcch568h8dh57h5b5h65ch99h666h588h68bhfch596h576h5bbq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spxsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(28b28ed8-c6be-4256-8ccd-8c560959048b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:49:19 crc kubenswrapper[4848]: E1206 15:49:19.405761 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="28b28ed8-c6be-4256-8ccd-8c560959048b" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.507837 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.508143 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-log" containerID="cri-o://b4bcb07f110ef58ee399698e916213a65c6de902e07a82a2db411be5fc9d05b8" gracePeriod=30 Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.508485 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-httpd" containerID="cri-o://e56f0b5bbe6fd90187a3c4bf4a60a370762d5f91ff34a66306c98d05543a6f5a" gracePeriod=30 Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.618455 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s6zql" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.631817 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.800794 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config\") pod \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.801577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnbrb\" (UniqueName: \"kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb\") pod \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.801741 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.801875 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.802752 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: 
\"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.802878 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle\") pod \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\" (UID: \"dc0186b0-9bb6-401b-bec2-80ee1058b4e8\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.802967 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kncck\" (UniqueName: \"kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.803034 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.803124 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts\") pod \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\" (UID: \"bb8d9713-c9fb-42c1-8496-03e949d82d8e\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.804803 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.807158 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb8d9713-c9fb-42c1-8496-03e949d82d8e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.809964 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb" (OuterVolumeSpecName: "kube-api-access-vnbrb") pod "dc0186b0-9bb6-401b-bec2-80ee1058b4e8" (UID: "dc0186b0-9bb6-401b-bec2-80ee1058b4e8"). InnerVolumeSpecName "kube-api-access-vnbrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.810150 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck" (OuterVolumeSpecName: "kube-api-access-kncck") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "kube-api-access-kncck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.810290 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.810632 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts" (OuterVolumeSpecName: "scripts") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.867986 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.877815 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config" (OuterVolumeSpecName: "config") pod "dc0186b0-9bb6-401b-bec2-80ee1058b4e8" (UID: "dc0186b0-9bb6-401b-bec2-80ee1058b4e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.879178 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.903392 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.908619 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4hqk\" (UniqueName: \"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.908744 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.908886 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.909167 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.909205 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.909276 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config\") pod \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\" (UID: \"0a1000e7-e5bf-483a-aaa5-d79003725c6d\") " Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910113 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kncck\" (UniqueName: \"kubernetes.io/projected/bb8d9713-c9fb-42c1-8496-03e949d82d8e-kube-api-access-kncck\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910135 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910150 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910162 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnbrb\" (UniqueName: \"kubernetes.io/projected/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-kube-api-access-vnbrb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910197 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.910212 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.912367 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk" (OuterVolumeSpecName: "kube-api-access-z4hqk") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). InnerVolumeSpecName "kube-api-access-z4hqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.916826 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0186b0-9bb6-401b-bec2-80ee1058b4e8" (UID: "dc0186b0-9bb6-401b-bec2-80ee1058b4e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.933682 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data" (OuterVolumeSpecName: "config-data") pod "bb8d9713-c9fb-42c1-8496-03e949d82d8e" (UID: "bb8d9713-c9fb-42c1-8496-03e949d82d8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.956590 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.959304 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.961527 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.967375 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config" (OuterVolumeSpecName: "config") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:19 crc kubenswrapper[4848]: I1206 15:49:19.975159 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a1000e7-e5bf-483a-aaa5-d79003725c6d" (UID: "0a1000e7-e5bf-483a-aaa5-d79003725c6d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011215 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011304 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011356 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011438 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011463 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011509 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pms\" (UniqueName: 
\"kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011551 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd\") pod \"fe0e76b7-c169-4b04-869f-d47bac964878\" (UID: \"fe0e76b7-c169-4b04-869f-d47bac964878\") " Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011875 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4hqk\" (UniqueName: \"kubernetes.io/projected/0a1000e7-e5bf-483a-aaa5-d79003725c6d-kube-api-access-z4hqk\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011892 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011901 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011910 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8d9713-c9fb-42c1-8496-03e949d82d8e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011919 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011927 4848 reconciler_common.go:293] "Volume detached for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011936 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0186b0-9bb6-401b-bec2-80ee1058b4e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.011944 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1000e7-e5bf-483a-aaa5-d79003725c6d-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.012691 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.012757 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.016767 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts" (OuterVolumeSpecName: "scripts") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.017811 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms" (OuterVolumeSpecName: "kube-api-access-67pms") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "kube-api-access-67pms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.048883 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.076468 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.098285 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data" (OuterVolumeSpecName: "config-data") pod "fe0e76b7-c169-4b04-869f-d47bac964878" (UID: "fe0e76b7-c169-4b04-869f-d47bac964878"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113123 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pms\" (UniqueName: \"kubernetes.io/projected/fe0e76b7-c169-4b04-869f-d47bac964878-kube-api-access-67pms\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113151 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113160 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113168 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113177 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113186 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0e76b7-c169-4b04-869f-d47bac964878-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.113196 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe0e76b7-c169-4b04-869f-d47bac964878-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.256323 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-sync-s6zql" event={"ID":"bb8d9713-c9fb-42c1-8496-03e949d82d8e","Type":"ContainerDied","Data":"710a275d993a5ddc2e08ba6406dc28dadeea12914faeb84d0e8748cf7d04e15e"} Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.256351 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s6zql" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.256360 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710a275d993a5ddc2e08ba6406dc28dadeea12914faeb84d0e8748cf7d04e15e" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.257983 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rfkvg" event={"ID":"dc0186b0-9bb6-401b-bec2-80ee1058b4e8","Type":"ContainerDied","Data":"10d12c7c55595425ecac5750149eff17b92fc147fd678f5d6e4081a582ed07bd"} Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.258006 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d12c7c55595425ecac5750149eff17b92fc147fd678f5d6e4081a582ed07bd" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.258164 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rfkvg" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.268054 4848 generic.go:334] "Generic (PLEG): container finished" podID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerID="b4bcb07f110ef58ee399698e916213a65c6de902e07a82a2db411be5fc9d05b8" exitCode=143 Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.268141 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerDied","Data":"b4bcb07f110ef58ee399698e916213a65c6de902e07a82a2db411be5fc9d05b8"} Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.274078 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76858ffddc-pvnks"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.275573 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" event={"ID":"0a1000e7-e5bf-483a-aaa5-d79003725c6d","Type":"ContainerDied","Data":"36b1eb77131f84373e8616e96bdaae1b248b57e3c95a397779a1904a784ae3bd"} Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.275678 4848 scope.go:117] "RemoveContainer" containerID="01b266830cc511b134b207bbce4248b0ae82af6c1341804ffb120ca6b1fed4a2" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.275769 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-nk6xz" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.281247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe0e76b7-c169-4b04-869f-d47bac964878","Type":"ContainerDied","Data":"8a1fc0b35311ea583ce41393cbabb832aeb40729b628f9af8b09191fc5b11e9a"} Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.281479 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.289243 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="28b28ed8-c6be-4256-8ccd-8c560959048b" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.312719 4848 scope.go:117] "RemoveContainer" containerID="f72e273d36a4741dd55f1287edfac8615bb0312d13e8d23e0e4bb7a98da69b85" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.331210 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.342484 4848 scope.go:117] "RemoveContainer" containerID="4a29ade8abaf4b55fa3b0a45279dc04fdd4b1d3cda18421fc52a4073da2bd29e" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.348053 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.360567 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.369766 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-nk6xz"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.372186 4848 scope.go:117] "RemoveContainer" containerID="16d1f5871c65e7ca1763c3fabbd9b952396702c1ff6a0c2b6e7a158f0b7bcdfb" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377104 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377501 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="sg-core" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 
15:49:20.377518 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="sg-core" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377531 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="proxy-httpd" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377537 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="proxy-httpd" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377547 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-notification-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377553 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-notification-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377561 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0186b0-9bb6-401b-bec2-80ee1058b4e8" containerName="neutron-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377567 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0186b0-9bb6-401b-bec2-80ee1058b4e8" containerName="neutron-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377575 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="dnsmasq-dns" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377580 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="dnsmasq-dns" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377592 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-central-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 
15:49:20.377598 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-central-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377606 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" containerName="cinder-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377612 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" containerName="cinder-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: E1206 15:49:20.377621 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="init" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.377627 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="init" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378026 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="sg-core" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378052 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-notification-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378063 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" containerName="cinder-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378074 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="ceilometer-central-agent" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378084 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0186b0-9bb6-401b-bec2-80ee1058b4e8" containerName="neutron-db-sync" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 
15:49:20.378098 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" containerName="proxy-httpd" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.378109 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" containerName="dnsmasq-dns" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.380096 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.382944 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.383021 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.385037 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.517933 4848 scope.go:117] "RemoveContainer" containerID="2b55866da83362d77e35d1af6087ae24fa43dffe93c928496a31609e6d7ea2c0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520043 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520062 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520077 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520219 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wx69\" (UniqueName: \"kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.520320 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.552796 4848 scope.go:117] "RemoveContainer" containerID="35b876245f7801f0343d52d8c0a744ab5322f9a08723d470b56fb0e4bc2abe09" Dec 06 15:49:20 crc 
kubenswrapper[4848]: I1206 15:49:20.622709 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.622786 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.622820 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.622861 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.622944 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wx69\" (UniqueName: \"kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.623012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts\") pod 
\"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.623047 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.623280 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.623513 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.626679 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.627213 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.628343 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.628797 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.647647 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wx69\" (UniqueName: \"kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69\") pod \"ceilometer-0\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.845658 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.847575 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.850195 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.850512 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.850618 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-db4wr"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.850649 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.851836 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4486" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.855982 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.862338 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.866559 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.874804 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-db4wr"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.932898 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933209 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933241 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933297 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx48s\" (UniqueName: \"kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933335 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933396 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933643 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.933814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlqsp\" (UniqueName: \"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.934007 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.934149 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.934280 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.970678 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.972306 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.975217 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.977090 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.977365 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-b5tjg" Dec 06 15:49:20 crc kubenswrapper[4848]: I1206 15:49:20.978834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.041950 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1000e7-e5bf-483a-aaa5-d79003725c6d" path="/var/lib/kubelet/pods/0a1000e7-e5bf-483a-aaa5-d79003725c6d/volumes" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 
15:49:21.042714 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0e76b7-c169-4b04-869f-d47bac964878" path="/var/lib/kubelet/pods/fe0e76b7-c169-4b04-869f-d47bac964878/volumes" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.043419 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-db4wr"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.046902 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.046976 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047013 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047040 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047224 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx48s\" (UniqueName: \"kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047271 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047296 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047431 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047532 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.047591 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlqsp\" (UniqueName: \"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.055351 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.060146 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.060841 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: E1206 15:49:21.062467 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 
kube-api-access-hx48s ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-688c87cc99-db4wr" podUID="4f678f10-aaa1-4aba-a430-4d92fe3d3c78" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.063087 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.064956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084114 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084233 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084266 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084326 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cmcq\" (UniqueName: \"kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.084878 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.092042 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.103264 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.104259 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.124512 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.124606 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.131540 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlqsp\" (UniqueName: \"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp\") pod \"neutron-85796f7496-8zdjc\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.164920 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx48s\" (UniqueName: \"kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s\") pod \"dnsmasq-dns-688c87cc99-db4wr\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.175630 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187660 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187794 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187826 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187864 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.187891 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4cmcq\" (UniqueName: \"kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.188250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.199152 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.219919 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.222435 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.222676 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.229556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.236585 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.239889 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.245605 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cmcq\" (UniqueName: \"kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq\") pod \"cinder-scheduler-0\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.257823 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.265534 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.270646 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.279677 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289228 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qr7\" (UniqueName: \"kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289295 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289316 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289363 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " 
pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289405 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.289462 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.348972 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.370381 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76858ffddc-pvnks" event={"ID":"6c86a3f4-dccd-48e8-9169-a63eaaded209","Type":"ContainerStarted","Data":"7c9a779c89c8ee092401c80c3ba743c4b03bdc0061dd5203c6ef307922902ba4"} Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.390880 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.390932 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.390956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.390976 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qr7\" (UniqueName: \"kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391047 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391066 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " 
pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391091 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391123 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391139 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgrtx\" (UniqueName: \"kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391197 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.391230 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.392015 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.392781 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.393265 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.406099 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.406366 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-log" containerID="cri-o://6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2" gracePeriod=30 Dec 06 15:49:21 crc 
kubenswrapper[4848]: I1206 15:49:21.406851 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-httpd" containerID="cri-o://76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3" gracePeriod=30 Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.409603 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.415239 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.431999 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.456656 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qr7\" (UniqueName: \"kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7\") pod \"dnsmasq-dns-6bb4fc677f-wq7q5\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.494712 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " 
pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgrtx\" (UniqueName: \"kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495161 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495181 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495296 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.495375 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.504790 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.505268 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.511238 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.511938 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.523963 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.526244 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgrtx\" (UniqueName: \"kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx\") pod \"cinder-api-0\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.573001 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.632602 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.673419 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704311 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704382 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704438 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704648 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx48s\" (UniqueName: \"kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.704771 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0\") pod \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\" (UID: \"4f678f10-aaa1-4aba-a430-4d92fe3d3c78\") " Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.706129 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.706474 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.706670 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.706865 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.707161 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config" (OuterVolumeSpecName: "config") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.728574 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s" (OuterVolumeSpecName: "kube-api-access-hx48s") pod "4f678f10-aaa1-4aba-a430-4d92fe3d3c78" (UID: "4f678f10-aaa1-4aba-a430-4d92fe3d3c78"). InnerVolumeSpecName "kube-api-access-hx48s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.792794 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806787 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx48s\" (UniqueName: \"kubernetes.io/projected/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-kube-api-access-hx48s\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806816 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806826 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806836 4848 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806845 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.806853 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f678f10-aaa1-4aba-a430-4d92fe3d3c78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:21 crc kubenswrapper[4848]: E1206 15:49:21.876958 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbceed76_344f_499a_8f86_12bcd30a2936.slice/crio-6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbceed76_344f_499a_8f86_12bcd30a2936.slice/crio-conmon-6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2.scope\": RecentStats: unable to find data in memory cache]" Dec 06 15:49:21 crc kubenswrapper[4848]: W1206 15:49:21.946608 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51797dd_e2f2_497c_a13c_921ab2868646.slice/crio-912768f9da63a04cedac7c4c0911cfb4c8adf844a9390b9fadb0ab11deeebe13 WatchSource:0}: Error finding container 912768f9da63a04cedac7c4c0911cfb4c8adf844a9390b9fadb0ab11deeebe13: Status 404 returned error can't find the container with id 912768f9da63a04cedac7c4c0911cfb4c8adf844a9390b9fadb0ab11deeebe13 Dec 06 15:49:21 crc kubenswrapper[4848]: I1206 15:49:21.952585 4848 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.073203 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.153874 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:48192->10.217.0.154:9311: read: connection reset by peer" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.153899 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdcd56486-4mb97" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:48176->10.217.0.154:9311: read: connection reset by peer" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.212179 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.307154 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:22 crc kubenswrapper[4848]: W1206 15:49:22.369707 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d65a058_3a8e_4083_8689_8627bf58faab.slice/crio-e66dc49263dfee04a2585f76b1239bc86c5ed15e2b1129eb0dd793da74b6019c WatchSource:0}: Error finding container e66dc49263dfee04a2585f76b1239bc86c5ed15e2b1129eb0dd793da74b6019c: Status 404 returned error can't find the container with id e66dc49263dfee04a2585f76b1239bc86c5ed15e2b1129eb0dd793da74b6019c Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.435140 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" event={"ID":"5e6a9170-ec3f-4704-8526-7736f298a496","Type":"ContainerStarted","Data":"5f91a90dc98aa7a52ee294a65505e3b0c51b0a8f3f549a0a35870f506604f587"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.437870 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerStarted","Data":"f46c3cbf9a74afe37af5baa0c78f95b86f29361688fbfb01f97f0602ce62da29"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.438779 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerStarted","Data":"c5aa8039fc37400349f7445e070b6edbaf27722a959a32a199ce78d5a8f3c350"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.440104 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerStarted","Data":"e66dc49263dfee04a2585f76b1239bc86c5ed15e2b1129eb0dd793da74b6019c"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.443007 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerStarted","Data":"912768f9da63a04cedac7c4c0911cfb4c8adf844a9390b9fadb0ab11deeebe13"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.445560 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbceed76-344f-499a-8f86-12bcd30a2936" containerID="6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2" exitCode=143 Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.445631 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerDied","Data":"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2"} Dec 06 15:49:22 
crc kubenswrapper[4848]: I1206 15:49:22.447673 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76858ffddc-pvnks" event={"ID":"6c86a3f4-dccd-48e8-9169-a63eaaded209","Type":"ContainerStarted","Data":"4f83e930cb6abd6594a6e9262a9d4948f8669244b49cd44ab3527cf17a500170"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.447738 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76858ffddc-pvnks" event={"ID":"6c86a3f4-dccd-48e8-9169-a63eaaded209","Type":"ContainerStarted","Data":"c8d6c70eb3291469019708b031377780259189094b47c74010c02e2e244ae6e6"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.448861 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.448886 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.461037 4848 generic.go:334] "Generic (PLEG): container finished" podID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerID="99e513c8ef9c3d15aa01e09162ff156ad51d7b6738fe58c478d253d9c27ea452" exitCode=0 Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.461139 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-db4wr" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.461369 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerDied","Data":"99e513c8ef9c3d15aa01e09162ff156ad51d7b6738fe58c478d253d9c27ea452"} Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.490550 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76858ffddc-pvnks" podStartSLOduration=9.490505964 podStartE2EDuration="9.490505964s" podCreationTimestamp="2025-12-06 15:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:22.477835841 +0000 UTC m=+1229.775846754" watchObservedRunningTime="2025-12-06 15:49:22.490505964 +0000 UTC m=+1229.788516877" Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.619811 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-db4wr"] Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.630418 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-db4wr"] Dec 06 15:49:22 crc kubenswrapper[4848]: I1206 15:49:22.983473 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f678f10-aaa1-4aba-a430-4d92fe3d3c78" path="/var/lib/kubelet/pods/4f678f10-aaa1-4aba-a430-4d92fe3d3c78/volumes" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.272151 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5ct5k"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.273997 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.311378 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.318462 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5ct5k"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.358124 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7nqw\" (UniqueName: \"kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.358451 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.383751 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-wxr2n"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.384914 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.398447 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c7d9-account-create-update-fx892"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.399631 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.404493 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.413762 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wxr2n"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.424475 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7d9-account-create-update-fx892"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464597 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7nqw\" (UniqueName: \"kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464651 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9nqj\" (UniqueName: \"kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464708 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464805 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n4jvj\" (UniqueName: \"kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464846 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.464941 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.469208 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.491381 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7nqw\" (UniqueName: \"kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw\") pod \"nova-api-db-create-5ct5k\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.567712 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.568121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.568154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9nqj\" (UniqueName: \"kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.568251 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jvj\" (UniqueName: \"kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.569202 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.570058 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.576535 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerStarted","Data":"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb"} Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.597006 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9t25c"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.598094 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jvj\" (UniqueName: \"kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj\") pod \"nova-cell0-db-create-wxr2n\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.598325 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.605575 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerStarted","Data":"e17acfc291a003894d0b7e0c70eaebbd7a9553f630e0ad2bb75196abadc3b11a"} Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.610947 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b459-account-create-update-7d7s8"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.612218 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.614573 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.626365 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.631055 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9nqj\" (UniqueName: \"kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj\") pod \"nova-api-c7d9-account-create-update-fx892\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.638167 4848 generic.go:334] "Generic (PLEG): container finished" podID="5e6a9170-ec3f-4704-8526-7736f298a496" containerID="190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b" exitCode=0 Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.638253 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" event={"ID":"5e6a9170-ec3f-4704-8526-7736f298a496","Type":"ContainerDied","Data":"190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b"} Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.650529 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9t25c"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.678347 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svvtz\" (UniqueName: \"kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz\") pod \"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " 
pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.678471 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts\") pod \"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.678516 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5dx\" (UniqueName: \"kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.678537 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.688345 4848 generic.go:334] "Generic (PLEG): container finished" podID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerID="e56f0b5bbe6fd90187a3c4bf4a60a370762d5f91ff34a66306c98d05543a6f5a" exitCode=0 Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.689449 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerDied","Data":"e56f0b5bbe6fd90187a3c4bf4a60a370762d5f91ff34a66306c98d05543a6f5a"} Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.732176 4848 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-b459-account-create-update-7d7s8"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.779975 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svvtz\" (UniqueName: \"kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz\") pod \"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.780093 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts\") pod \"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.780179 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5dx\" (UniqueName: \"kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.780208 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.781008 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts\") pod 
\"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.782040 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.814290 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.820045 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5dx\" (UniqueName: \"kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx\") pod \"nova-cell1-db-create-9t25c\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.826486 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.835219 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svvtz\" (UniqueName: \"kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz\") pod \"nova-cell0-b459-account-create-update-7d7s8\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.898128 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.916877 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.928650 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8c96-account-create-update-r6sqf"] Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.968426 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.971531 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 06 15:49:23 crc kubenswrapper[4848]: I1206 15:49:23.990250 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.081803 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8c96-account-create-update-r6sqf"] Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.093914 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msldt\" (UniqueName: \"kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt\") pod \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.093988 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom\") pod \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.094170 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs\") pod \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.094187 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data\") pod \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.094218 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle\") pod \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\" (UID: \"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6\") " Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.094719 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h77f\" (UniqueName: \"kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.094757 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.099047 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs" (OuterVolumeSpecName: "logs") pod "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" (UID: "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.105124 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt" (OuterVolumeSpecName: "kube-api-access-msldt") pod "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" (UID: "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6"). InnerVolumeSpecName "kube-api-access-msldt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.109885 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" (UID: "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.149132 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" (UID: "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.179433 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data" (OuterVolumeSpecName: "config-data") pod "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" (UID: "c82eedfe-54ed-4fc6-9050-eee7f1eb39d6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198050 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h77f\" (UniqueName: \"kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198093 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198169 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198180 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198189 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198199 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msldt\" (UniqueName: \"kubernetes.io/projected/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-kube-api-access-msldt\") on node \"crc\" DevicePath 
\"\"" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198207 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.198771 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.216657 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h77f\" (UniqueName: \"kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f\") pod \"nova-cell1-8c96-account-create-update-r6sqf\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.324255 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.453771 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5ct5k"] Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.651895 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7d9-account-create-update-fx892"] Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.765036 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cdcd56486-4mb97" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.768450 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdcd56486-4mb97" event={"ID":"c82eedfe-54ed-4fc6-9050-eee7f1eb39d6","Type":"ContainerDied","Data":"3d931f5c9c9a5eea838edccf4df2095a0e3ce4ab2391a3787d5bf30fafeb10bf"} Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.768493 4848 scope.go:117] "RemoveContainer" containerID="99e513c8ef9c3d15aa01e09162ff156ad51d7b6738fe58c478d253d9c27ea452" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.786042 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerStarted","Data":"461fff830f900e9841f5039799776ca4845958258b2970d2b68d2ed071f5145b"} Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.791820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce","Type":"ContainerDied","Data":"bf849cb99bef90fb2f53e1c045acb624718ab668ee73f18e1970562eed5c59fe"} Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.791853 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf849cb99bef90fb2f53e1c045acb624718ab668ee73f18e1970562eed5c59fe" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.793377 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5ct5k" event={"ID":"9af5a0b7-5219-4bf8-9e36-87218655227b","Type":"ContainerStarted","Data":"d9e19266172e47e31020c860530f748f1c3c9ccc0a264aea2da1b21762d38024"} Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.798185 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" 
event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerStarted","Data":"3b6171b08d63c284cb3b979d9307061c12a1098ffb17f2d08e2487a26fe23f44"} Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.798232 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.825983 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85796f7496-8zdjc" podStartSLOduration=4.825962721 podStartE2EDuration="4.825962721s" podCreationTimestamp="2025-12-06 15:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:24.816066153 +0000 UTC m=+1232.114077066" watchObservedRunningTime="2025-12-06 15:49:24.825962721 +0000 UTC m=+1232.123973634" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.878310 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.146:9292/healthcheck\": dial tcp 10.217.0.146:9292: connect: connection refused" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.879518 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.146:9292/healthcheck\": dial tcp 10.217.0.146:9292: connect: connection refused" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.896342 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.913595 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:24 crc kubenswrapper[4848]: I1206 15:49:24.942811 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cdcd56486-4mb97"] Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.008314 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" path="/var/lib/kubelet/pods/c82eedfe-54ed-4fc6-9050-eee7f1eb39d6/volumes" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.041751 4848 scope.go:117] "RemoveContainer" containerID="ca978b1092cf0b8ab8bc5fb6a3ddb8ca89c85d05ff505058d78e15218334e4de" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048362 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048404 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkzt\" (UniqueName: \"kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048451 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048512 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048547 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048608 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048663 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.048747 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data\") pod \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\" (UID: \"fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.050353 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs" (OuterVolumeSpecName: "logs") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.053425 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.064448 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.067445 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts" (OuterVolumeSpecName: "scripts") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.071281 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt" (OuterVolumeSpecName: "kube-api-access-zfkzt") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "kube-api-access-zfkzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.137145 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9t25c"] Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.173412 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.173728 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.173745 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.173754 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkzt\" (UniqueName: \"kubernetes.io/projected/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-kube-api-access-zfkzt\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.173767 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: W1206 15:49:25.195537 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9624a1f_b378_4f0a_af01_12f45f8c7694.slice/crio-d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d WatchSource:0}: Error finding container d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d: Status 404 returned error can't find the container with id 
d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.340023 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.346122 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.381835 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.382186 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.436481 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.444372 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data" (OuterVolumeSpecName: "config-data") pod "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" (UID: "fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.486252 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.486283 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.697412 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wxr2n"] Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.717091 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8c96-account-create-update-r6sqf"] Dec 06 15:49:25 crc kubenswrapper[4848]: W1206 15:49:25.717830 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3358e13_6e57_435c_ba4f_671c970019fd.slice/crio-e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10 WatchSource:0}: Error finding container e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10: Status 404 returned error can't find the container with id e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10 Dec 06 15:49:25 crc kubenswrapper[4848]: W1206 15:49:25.738144 4848 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4ae7cfb_1f4e_4947_94c8_d358f5e36476.slice/crio-2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f WatchSource:0}: Error finding container 2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f: Status 404 returned error can't find the container with id 2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.752150 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b459-account-create-update-7d7s8"] Dec 06 15:49:25 crc kubenswrapper[4848]: W1206 15:49:25.773981 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d15aeb9_60fe_4de1_a715_7843431f9f7f.slice/crio-d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30 WatchSource:0}: Error finding container d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30: Status 404 returned error can't find the container with id d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30 Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.793217 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.849152 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerStarted","Data":"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.849299 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api-log" containerID="cri-o://03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" gracePeriod=30 Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.849547 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.849795 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api" containerID="cri-o://011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" gracePeriod=30 Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.869015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7d9-account-create-update-fx892" event={"ID":"81599cc7-40f1-4f1c-96e5-9465b89b0517","Type":"ContainerStarted","Data":"b4857ac3c41c0b1802966e2f124cb66a8faedaa7d83295efed37e6c6b472aa46"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.869060 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7d9-account-create-update-fx892" event={"ID":"81599cc7-40f1-4f1c-96e5-9465b89b0517","Type":"ContainerStarted","Data":"c4d8c1e306b30722090a9bfc31e5eae81cc5f19a2f94fd4e7fb08f83b1d61e72"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.880183 4848 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.880159073 podStartE2EDuration="4.880159073s" podCreationTimestamp="2025-12-06 15:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:25.872032283 +0000 UTC m=+1233.170043206" watchObservedRunningTime="2025-12-06 15:49:25.880159073 +0000 UTC m=+1233.178169986" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.884921 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxr2n" event={"ID":"e3358e13-6e57-435c-ba4f-671c970019fd","Type":"ContainerStarted","Data":"e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904239 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904432 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904516 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904562 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904592 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904643 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.904686 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl2ps\" (UniqueName: \"kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps\") pod \"dbceed76-344f-499a-8f86-12bcd30a2936\" (UID: \"dbceed76-344f-499a-8f86-12bcd30a2936\") " Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.906666 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs" (OuterVolumeSpecName: "logs") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.906883 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c7d9-account-create-update-fx892" podStartSLOduration=2.906863286 podStartE2EDuration="2.906863286s" podCreationTimestamp="2025-12-06 15:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:25.903978408 +0000 UTC m=+1233.201989321" watchObservedRunningTime="2025-12-06 15:49:25.906863286 +0000 UTC m=+1233.204874199" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.913632 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.914297 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbceed76-344f-499a-8f86-12bcd30a2936" containerID="76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3" exitCode=0 Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.914505 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.916415 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerDied","Data":"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.916472 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dbceed76-344f-499a-8f86-12bcd30a2936","Type":"ContainerDied","Data":"e37081cd8e0261cba52cfac12f36a0a640aa9b5d1b6e38876b13e3f3d1da98d6"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.916525 4848 scope.go:117] "RemoveContainer" containerID="76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.939256 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts" (OuterVolumeSpecName: "scripts") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.945342 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.951675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9t25c" event={"ID":"d9624a1f-b378-4f0a-af01-12f45f8c7694","Type":"ContainerStarted","Data":"5fd6129be1335ee4ff05e8297387246755952393d2aabd6d7348ca9d3217ced0"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.951745 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9t25c" event={"ID":"d9624a1f-b378-4f0a-af01-12f45f8c7694","Type":"ContainerStarted","Data":"d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.971934 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps" (OuterVolumeSpecName: "kube-api-access-kl2ps") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "kube-api-access-kl2ps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.972867 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" event={"ID":"f4ae7cfb-1f4e-4947-94c8-d358f5e36476","Type":"ContainerStarted","Data":"2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.987631 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" event={"ID":"5e6a9170-ec3f-4704-8526-7736f298a496","Type":"ContainerStarted","Data":"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428"} Dec 06 15:49:25 crc kubenswrapper[4848]: I1206 15:49:25.989870 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.000998 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-9t25c" podStartSLOduration=3.000979932 podStartE2EDuration="3.000979932s" podCreationTimestamp="2025-12-06 15:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:25.973421756 +0000 UTC m=+1233.271432669" watchObservedRunningTime="2025-12-06 15:49:26.000979932 +0000 UTC m=+1233.298990845" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.010399 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.010447 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 
15:49:26.010460 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbceed76-344f-499a-8f86-12bcd30a2936-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.010475 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl2ps\" (UniqueName: \"kubernetes.io/projected/dbceed76-344f-499a-8f86-12bcd30a2936-kube-api-access-kl2ps\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.010489 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.013243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5ct5k" event={"ID":"9af5a0b7-5219-4bf8-9e36-87218655227b","Type":"ContainerStarted","Data":"b0fa0e34a4343cdfaadf196b3ac94cefd6cf6be7b1b969f0308f150eb55e6555"} Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.017457 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" event={"ID":"7d15aeb9-60fe-4de1-a715-7843431f9f7f","Type":"ContainerStarted","Data":"d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30"} Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.017556 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.063954 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" podStartSLOduration=5.063929505 podStartE2EDuration="5.063929505s" podCreationTimestamp="2025-12-06 15:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:26.03124208 +0000 UTC m=+1233.329253003" watchObservedRunningTime="2025-12-06 15:49:26.063929505 +0000 UTC m=+1233.361940418" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.099908 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.114208 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.118924 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.131483 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.145026 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.146813 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data" (OuterVolumeSpecName: "config-data") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.172916 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.173606 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.173734 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.173833 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.173916 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.174007 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-httpd" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.174086 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-httpd" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.174173 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.174241 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.174327 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-httpd" Dec 06 15:49:26 crc 
kubenswrapper[4848]: I1206 15:49:26.174394 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-httpd" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.174493 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.174571 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176354 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176459 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-httpd" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176552 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" containerName="glance-httpd" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176638 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" containerName="glance-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176791 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api-log" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.176897 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82eedfe-54ed-4fc6-9050-eee7f1eb39d6" containerName="barbican-api" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.178108 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.182303 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.183130 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.204217 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dbceed76-344f-499a-8f86-12bcd30a2936" (UID: "dbceed76-344f-499a-8f86-12bcd30a2936"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.215925 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.217026 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.217249 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.217277 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbceed76-344f-499a-8f86-12bcd30a2936-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.225622 4848 scope.go:117] "RemoveContainer" containerID="6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2" Dec 
06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.306839 4848 scope.go:117] "RemoveContainer" containerID="76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.307527 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3\": container with ID starting with 76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3 not found: ID does not exist" containerID="76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.307562 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3"} err="failed to get container status \"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3\": rpc error: code = NotFound desc = could not find container \"76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3\": container with ID starting with 76baf67732dfa1074a1d377d403f088cf39d7c8a29084b2548ee2a61d563f7e3 not found: ID does not exist" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.307588 4848 scope.go:117] "RemoveContainer" containerID="6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2" Dec 06 15:49:26 crc kubenswrapper[4848]: E1206 15:49:26.308001 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2\": container with ID starting with 6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2 not found: ID does not exist" containerID="6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.308023 4848 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2"} err="failed to get container status \"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2\": rpc error: code = NotFound desc = could not find container \"6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2\": container with ID starting with 6f71487eff07a8919113ebe390c4cb5d8da6e8b27c53cca02db0bbf28281cce2 not found: ID does not exist" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.320772 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.320849 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-logs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.320922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.320967 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfs2d\" (UniqueName: \"kubernetes.io/projected/9f14ea76-e339-4e47-9063-898de1d2fac8-kube-api-access-pfs2d\") pod \"glance-default-external-api-0\" (UID: 
\"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.321024 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.321059 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.321095 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.321147 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.422489 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.425797 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427226 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427249 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427280 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427358 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427419 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427507 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-logs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427631 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.427727 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfs2d\" (UniqueName: \"kubernetes.io/projected/9f14ea76-e339-4e47-9063-898de1d2fac8-kube-api-access-pfs2d\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.436132 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f14ea76-e339-4e47-9063-898de1d2fac8-logs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.436416 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"9f14ea76-e339-4e47-9063-898de1d2fac8\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.438153 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.440840 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.441445 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.441775 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.451256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f14ea76-e339-4e47-9063-898de1d2fac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.456237 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:26 crc 
kubenswrapper[4848]: I1206 15:49:26.458250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfs2d\" (UniqueName: \"kubernetes.io/projected/9f14ea76-e339-4e47-9063-898de1d2fac8-kube-api-access-pfs2d\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.458288 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.467392 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.467679 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.473255 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.565236 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"9f14ea76-e339-4e47-9063-898de1d2fac8\") " pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.632847 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633095 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633117 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633165 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49vg\" (UniqueName: \"kubernetes.io/projected/08ab771a-21e5-4145-8954-8ac8c039a8c4-kube-api-access-t49vg\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633271 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633308 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.633325 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.734915 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.734961 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735037 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735057 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735079 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735094 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t49vg\" (UniqueName: \"kubernetes.io/projected/08ab771a-21e5-4145-8954-8ac8c039a8c4-kube-api-access-t49vg\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.735189 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.737076 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.738101 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.738432 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ab771a-21e5-4145-8954-8ac8c039a8c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.744893 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.745247 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.746891 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.760906 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ab771a-21e5-4145-8954-8ac8c039a8c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.781895 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t49vg\" (UniqueName: \"kubernetes.io/projected/08ab771a-21e5-4145-8954-8ac8c039a8c4-kube-api-access-t49vg\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.791066 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"08ab771a-21e5-4145-8954-8ac8c039a8c4\") " pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.806192 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.895016 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.938209 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.948245 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.949822 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgrtx\" (UniqueName: \"kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.949869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.949895 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.949929 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.950102 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.950147 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts\") pod \"3d65a058-3a8e-4083-8689-8627bf58faab\" (UID: \"3d65a058-3a8e-4083-8689-8627bf58faab\") " Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.950284 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.950763 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d65a058-3a8e-4083-8689-8627bf58faab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.960378 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts" (OuterVolumeSpecName: "scripts") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.960667 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs" (OuterVolumeSpecName: "logs") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:26 crc kubenswrapper[4848]: I1206 15:49:26.976966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.032063 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx" (OuterVolumeSpecName: "kube-api-access-hgrtx") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "kube-api-access-hgrtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.038574 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbceed76-344f-499a-8f86-12bcd30a2936" path="/var/lib/kubelet/pods/dbceed76-344f-499a-8f86-12bcd30a2936/volumes" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.039715 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce" path="/var/lib/kubelet/pods/fbc75ac1-4e57-44f6-b2cb-0ef3bf1385ce/volumes" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.064642 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d65a058-3a8e-4083-8689-8627bf58faab-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.064684 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-scripts\") on node \"crc\" DevicePath \"\"" Dec 
06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.064731 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgrtx\" (UniqueName: \"kubernetes.io/projected/3d65a058-3a8e-4083-8689-8627bf58faab-kube-api-access-hgrtx\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.064748 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.082976 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data" (OuterVolumeSpecName: "config-data") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.083935 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d65a058-3a8e-4083-8689-8627bf58faab" (UID: "3d65a058-3a8e-4083-8689-8627bf58faab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.139011 4848 generic.go:334] "Generic (PLEG): container finished" podID="9af5a0b7-5219-4bf8-9e36-87218655227b" containerID="b0fa0e34a4343cdfaadf196b3ac94cefd6cf6be7b1b969f0308f150eb55e6555" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.139280 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5ct5k" event={"ID":"9af5a0b7-5219-4bf8-9e36-87218655227b","Type":"ContainerDied","Data":"b0fa0e34a4343cdfaadf196b3ac94cefd6cf6be7b1b969f0308f150eb55e6555"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.148376 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" event={"ID":"7d15aeb9-60fe-4de1-a715-7843431f9f7f","Type":"ContainerStarted","Data":"725cf6c1857963b710d76c9bc2b2ed1a2af1d1e43ab68b22b140a0741d3cc914"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.166340 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.166368 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a058-3a8e-4083-8689-8627bf58faab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174251 4848 generic.go:334] "Generic (PLEG): container finished" podID="3d65a058-3a8e-4083-8689-8627bf58faab" containerID="011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174285 4848 generic.go:334] "Generic (PLEG): container finished" podID="3d65a058-3a8e-4083-8689-8627bf58faab" containerID="03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" exitCode=143 Dec 06 
15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174362 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerDied","Data":"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174395 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerDied","Data":"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174410 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d65a058-3a8e-4083-8689-8627bf58faab","Type":"ContainerDied","Data":"e66dc49263dfee04a2585f76b1239bc86c5ed15e2b1129eb0dd793da74b6019c"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174428 4848 scope.go:117] "RemoveContainer" containerID="011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.174567 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.178222 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" podStartSLOduration=4.178203423 podStartE2EDuration="4.178203423s" podCreationTimestamp="2025-12-06 15:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:27.173540936 +0000 UTC m=+1234.471551849" watchObservedRunningTime="2025-12-06 15:49:27.178203423 +0000 UTC m=+1234.476214336" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.201791 4848 generic.go:334] "Generic (PLEG): container finished" podID="d9624a1f-b378-4f0a-af01-12f45f8c7694" containerID="5fd6129be1335ee4ff05e8297387246755952393d2aabd6d7348ca9d3217ced0" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.201927 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9t25c" event={"ID":"d9624a1f-b378-4f0a-af01-12f45f8c7694","Type":"ContainerDied","Data":"5fd6129be1335ee4ff05e8297387246755952393d2aabd6d7348ca9d3217ced0"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.205027 4848 generic.go:334] "Generic (PLEG): container finished" podID="e3358e13-6e57-435c-ba4f-671c970019fd" containerID="737e921d23414808f8aeb148a7ce3b5ebadcebc6c176c774b58b32a428cbb62e" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.205215 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxr2n" event={"ID":"e3358e13-6e57-435c-ba4f-671c970019fd","Type":"ContainerDied","Data":"737e921d23414808f8aeb148a7ce3b5ebadcebc6c176c774b58b32a428cbb62e"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.224678 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" 
event={"ID":"f4ae7cfb-1f4e-4947-94c8-d358f5e36476","Type":"ContainerStarted","Data":"7c693a591a909a1baedb6df1ea688682e5b6abacaf3c4cde2b528e2328a4e385"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.249757 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerStarted","Data":"b2efa2e5aa05abf13e893ea4a3903cac7e2ba3d32b0068de4b60c7c28e364a6f"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.272183 4848 generic.go:334] "Generic (PLEG): container finished" podID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerID="26fd3b3245be8c11ead1dc1a59a18cf3fa6e34c5f6b7ed5999506338d06afbb3" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.272271 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerDied","Data":"26fd3b3245be8c11ead1dc1a59a18cf3fa6e34c5f6b7ed5999506338d06afbb3"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.286909 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerStarted","Data":"8c0e142bdef49cd5e2663be8c1ca22e6a36d0866224f6a9ca9065b17aad9e042"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.289726 4848 generic.go:334] "Generic (PLEG): container finished" podID="81599cc7-40f1-4f1c-96e5-9465b89b0517" containerID="b4857ac3c41c0b1802966e2f124cb66a8faedaa7d83295efed37e6c6b472aa46" exitCode=0 Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.289833 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7d9-account-create-update-fx892" event={"ID":"81599cc7-40f1-4f1c-96e5-9465b89b0517","Type":"ContainerDied","Data":"b4857ac3c41c0b1802966e2f124cb66a8faedaa7d83295efed37e6c6b472aa46"} Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.345087 4848 scope.go:117] "RemoveContainer" 
containerID="03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.357990 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" podStartSLOduration=4.357964876 podStartE2EDuration="4.357964876s" podCreationTimestamp="2025-12-06 15:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:27.279982486 +0000 UTC m=+1234.577993409" watchObservedRunningTime="2025-12-06 15:49:27.357964876 +0000 UTC m=+1234.655975799" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.395000 4848 scope.go:117] "RemoveContainer" containerID="011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" Dec 06 15:49:27 crc kubenswrapper[4848]: E1206 15:49:27.398979 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04\": container with ID starting with 011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04 not found: ID does not exist" containerID="011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.399018 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04"} err="failed to get container status \"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04\": rpc error: code = NotFound desc = could not find container \"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04\": container with ID starting with 011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04 not found: ID does not exist" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.399039 4848 scope.go:117] 
"RemoveContainer" containerID="03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.399466 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: E1206 15:49:27.399802 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb\": container with ID starting with 03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb not found: ID does not exist" containerID="03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.399827 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb"} err="failed to get container status \"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb\": rpc error: code = NotFound desc = could not find container \"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb\": container with ID starting with 03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb not found: ID does not exist" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.399840 4848 scope.go:117] "RemoveContainer" containerID="011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.400796 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04"} err="failed to get container status \"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04\": rpc error: code = NotFound desc = could not find container \"011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04\": container with ID starting with 
011566a6a69f9d8068275624d656bc0f8bcfe94945a3100dfaaa42f4f4639b04 not found: ID does not exist" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.400814 4848 scope.go:117] "RemoveContainer" containerID="03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.401027 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb"} err="failed to get container status \"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb\": rpc error: code = NotFound desc = could not find container \"03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb\": container with ID starting with 03bcce3a8bc9af9d39ff8e8f165ed1aaa1c354479019fe036d61b6be3c77e1bb not found: ID does not exist" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.408788 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.416165 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: E1206 15:49:27.416604 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.416624 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api" Dec 06 15:49:27 crc kubenswrapper[4848]: E1206 15:49:27.416678 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api-log" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.416685 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api-log" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 
15:49:27.416904 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api-log" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.416926 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" containerName="cinder-api" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.418079 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.421985 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.422222 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.423568 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.423981 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474004 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8456\" (UniqueName: \"kubernetes.io/projected/61e60c86-1fae-4b73-9c2c-bb5bdd108630-kube-api-access-f8456\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474078 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61e60c86-1fae-4b73-9c2c-bb5bdd108630-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 
15:49:27.474098 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e60c86-1fae-4b73-9c2c-bb5bdd108630-logs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474118 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474142 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474169 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-public-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474219 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data-custom\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474238 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.474294 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-scripts\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.543646 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582525 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e60c86-1fae-4b73-9c2c-bb5bdd108630-logs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582560 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61e60c86-1fae-4b73-9c2c-bb5bdd108630-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582578 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582602 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582631 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-public-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582679 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data-custom\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582717 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582772 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-scripts\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.582808 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8456\" (UniqueName: \"kubernetes.io/projected/61e60c86-1fae-4b73-9c2c-bb5bdd108630-kube-api-access-f8456\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc 
kubenswrapper[4848]: I1206 15:49:27.583637 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e60c86-1fae-4b73-9c2c-bb5bdd108630-logs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.591613 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data-custom\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.592818 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61e60c86-1fae-4b73-9c2c-bb5bdd108630-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.594090 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-public-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.596743 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-scripts\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.599927 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.602150 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8456\" (UniqueName: \"kubernetes.io/projected/61e60c86-1fae-4b73-9c2c-bb5bdd108630-kube-api-access-f8456\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.604515 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-config-data\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.608256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e60c86-1fae-4b73-9c2c-bb5bdd108630-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"61e60c86-1fae-4b73-9c2c-bb5bdd108630\") " pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.691329 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.772162 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b7d47d5c9-wf778"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.773762 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.779116 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.779429 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.785529 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7d47d5c9-wf778"] Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.796828 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.841095 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.889556 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7nqw\" (UniqueName: \"kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw\") pod \"9af5a0b7-5219-4bf8-9e36-87218655227b\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.889842 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts\") pod \"9af5a0b7-5219-4bf8-9e36-87218655227b\" (UID: \"9af5a0b7-5219-4bf8-9e36-87218655227b\") " Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890105 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-combined-ca-bundle\") pod \"neutron-6b7d47d5c9-wf778\" (UID: 
\"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890142 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-public-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890188 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-httpd-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890225 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890299 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qpf\" (UniqueName: \"kubernetes.io/projected/aead053e-0f4a-48bf-b446-9a1dbdc7e996-kube-api-access-g8qpf\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890329 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-ovndb-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: 
\"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.890348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-internal-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.891055 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9af5a0b7-5219-4bf8-9e36-87218655227b" (UID: "9af5a0b7-5219-4bf8-9e36-87218655227b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.900923 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw" (OuterVolumeSpecName: "kube-api-access-l7nqw") pod "9af5a0b7-5219-4bf8-9e36-87218655227b" (UID: "9af5a0b7-5219-4bf8-9e36-87218655227b"). InnerVolumeSpecName "kube-api-access-l7nqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992683 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-ovndb-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-internal-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992774 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-combined-ca-bundle\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992799 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-public-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992840 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-httpd-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc 
kubenswrapper[4848]: I1206 15:49:27.992878 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.992966 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qpf\" (UniqueName: \"kubernetes.io/projected/aead053e-0f4a-48bf-b446-9a1dbdc7e996-kube-api-access-g8qpf\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.993031 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af5a0b7-5219-4bf8-9e36-87218655227b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:27 crc kubenswrapper[4848]: I1206 15:49:27.993047 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7nqw\" (UniqueName: \"kubernetes.io/projected/9af5a0b7-5219-4bf8-9e36-87218655227b-kube-api-access-l7nqw\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.003532 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-httpd-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.004159 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-internal-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " 
pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.004380 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-ovndb-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.013094 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-combined-ca-bundle\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.013996 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-config\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.019238 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aead053e-0f4a-48bf-b446-9a1dbdc7e996-public-tls-certs\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.026222 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qpf\" (UniqueName: \"kubernetes.io/projected/aead053e-0f4a-48bf-b446-9a1dbdc7e996-kube-api-access-g8qpf\") pod \"neutron-6b7d47d5c9-wf778\" (UID: \"aead053e-0f4a-48bf-b446-9a1dbdc7e996\") " pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.158764 4848 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.342788 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f14ea76-e339-4e47-9063-898de1d2fac8","Type":"ContainerStarted","Data":"39c8ac10835ea32c9fa631d35e2ed6451f9ff5fa0561503613fb4545acb25873"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.350061 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4ae7cfb-1f4e-4947-94c8-d358f5e36476" containerID="7c693a591a909a1baedb6df1ea688682e5b6abacaf3c4cde2b528e2328a4e385" exitCode=0 Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.350117 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" event={"ID":"f4ae7cfb-1f4e-4947-94c8-d358f5e36476","Type":"ContainerDied","Data":"7c693a591a909a1baedb6df1ea688682e5b6abacaf3c4cde2b528e2328a4e385"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.354309 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.388659 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerStarted","Data":"4f1dbaefbc8189682f09640c39bf6e00ceab627e5f8b870492b8326d762c2f08"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.394805 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerStarted","Data":"7d4d795618e6a64a2089b08aa686ddfd43fda77d755214206db297d0c3a570ec"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.408158 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5ct5k" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.408810 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5ct5k" event={"ID":"9af5a0b7-5219-4bf8-9e36-87218655227b","Type":"ContainerDied","Data":"d9e19266172e47e31020c860530f748f1c3c9ccc0a264aea2da1b21762d38024"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.408848 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e19266172e47e31020c860530f748f1c3c9ccc0a264aea2da1b21762d38024" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.424479 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.640299964 podStartE2EDuration="8.424465281s" podCreationTimestamp="2025-12-06 15:49:20 +0000 UTC" firstStartedPulling="2025-12-06 15:49:22.091804656 +0000 UTC m=+1229.389815569" lastFinishedPulling="2025-12-06 15:49:24.875969973 +0000 UTC m=+1232.173980886" observedRunningTime="2025-12-06 15:49:28.421958073 +0000 UTC m=+1235.719968976" watchObservedRunningTime="2025-12-06 15:49:28.424465281 +0000 UTC m=+1235.722476194" Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.429736 4848 generic.go:334] "Generic (PLEG): container finished" podID="7d15aeb9-60fe-4de1-a715-7843431f9f7f" containerID="725cf6c1857963b710d76c9bc2b2ed1a2af1d1e43ab68b22b140a0741d3cc914" exitCode=0 Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.429826 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" event={"ID":"7d15aeb9-60fe-4de1-a715-7843431f9f7f","Type":"ContainerDied","Data":"725cf6c1857963b710d76c9bc2b2ed1a2af1d1e43ab68b22b140a0741d3cc914"} Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.432335 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"08ab771a-21e5-4145-8954-8ac8c039a8c4","Type":"ContainerStarted","Data":"07022e4b7fb60f0e1dba6f92c0180584cb86bd6aa713d6c87e257cff456b78ff"} Dec 06 15:49:28 crc kubenswrapper[4848]: W1206 15:49:28.453722 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e60c86_1fae_4b73_9c2c_bb5bdd108630.slice/crio-81c7d810f16e59d6095631773efa6b3f632991096b7381d35d941cf699776d0e WatchSource:0}: Error finding container 81c7d810f16e59d6095631773efa6b3f632991096b7381d35d941cf699776d0e: Status 404 returned error can't find the container with id 81c7d810f16e59d6095631773efa6b3f632991096b7381d35d941cf699776d0e Dec 06 15:49:28 crc kubenswrapper[4848]: I1206 15:49:28.996687 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d65a058-3a8e-4083-8689-8627bf58faab" path="/var/lib/kubelet/pods/3d65a058-3a8e-4083-8689-8627bf58faab/volumes" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.172975 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.214665 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts\") pod \"81599cc7-40f1-4f1c-96e5-9465b89b0517\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.214845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9nqj\" (UniqueName: \"kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj\") pod \"81599cc7-40f1-4f1c-96e5-9465b89b0517\" (UID: \"81599cc7-40f1-4f1c-96e5-9465b89b0517\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.224455 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj" (OuterVolumeSpecName: "kube-api-access-x9nqj") pod "81599cc7-40f1-4f1c-96e5-9465b89b0517" (UID: "81599cc7-40f1-4f1c-96e5-9465b89b0517"). InnerVolumeSpecName "kube-api-access-x9nqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.228612 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.231276 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81599cc7-40f1-4f1c-96e5-9465b89b0517" (UID: "81599cc7-40f1-4f1c-96e5-9465b89b0517"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.249506 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.251248 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.324775 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328169 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts\") pod \"d9624a1f-b378-4f0a-af01-12f45f8c7694\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328242 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328310 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328358 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4jvj\" (UniqueName: \"kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj\") pod \"e3358e13-6e57-435c-ba4f-671c970019fd\" (UID: 
\"e3358e13-6e57-435c-ba4f-671c970019fd\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328379 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts\") pod \"e3358e13-6e57-435c-ba4f-671c970019fd\" (UID: \"e3358e13-6e57-435c-ba4f-671c970019fd\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328430 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt5dx\" (UniqueName: \"kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx\") pod \"d9624a1f-b378-4f0a-af01-12f45f8c7694\" (UID: \"d9624a1f-b378-4f0a-af01-12f45f8c7694\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328458 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkvjh\" (UniqueName: \"kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328487 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328573 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328624 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data\") pod \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\" (UID: \"a75f41ed-628b-4e88-8d67-ada299f1c7a9\") " Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328974 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81599cc7-40f1-4f1c-96e5-9465b89b0517-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.328984 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9nqj\" (UniqueName: \"kubernetes.io/projected/81599cc7-40f1-4f1c-96e5-9465b89b0517-kube-api-access-x9nqj\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.330203 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9624a1f-b378-4f0a-af01-12f45f8c7694" (UID: "d9624a1f-b378-4f0a-af01-12f45f8c7694"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.331822 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3358e13-6e57-435c-ba4f-671c970019fd" (UID: "e3358e13-6e57-435c-ba4f-671c970019fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.333483 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). 
InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.347432 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.347628 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76858ffddc-pvnks" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.347687 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh" (OuterVolumeSpecName: "kube-api-access-jkvjh") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). InnerVolumeSpecName "kube-api-access-jkvjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.350728 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx" (OuterVolumeSpecName: "kube-api-access-zt5dx") pod "d9624a1f-b378-4f0a-af01-12f45f8c7694" (UID: "d9624a1f-b378-4f0a-af01-12f45f8c7694"). InnerVolumeSpecName "kube-api-access-zt5dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.355621 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj" (OuterVolumeSpecName: "kube-api-access-n4jvj") pod "e3358e13-6e57-435c-ba4f-671c970019fd" (UID: "e3358e13-6e57-435c-ba4f-671c970019fd"). InnerVolumeSpecName "kube-api-access-n4jvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.357234 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts" (OuterVolumeSpecName: "scripts") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.369641 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7d47d5c9-wf778"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.392978 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data" (OuterVolumeSpecName: "config-data") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430527 4848 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a75f41ed-628b-4e88-8d67-ada299f1c7a9-etc-podinfo\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430562 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4jvj\" (UniqueName: \"kubernetes.io/projected/e3358e13-6e57-435c-ba4f-671c970019fd-kube-api-access-n4jvj\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430575 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3358e13-6e57-435c-ba4f-671c970019fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430584 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt5dx\" (UniqueName: \"kubernetes.io/projected/d9624a1f-b378-4f0a-af01-12f45f8c7694-kube-api-access-zt5dx\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430592 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkvjh\" (UniqueName: \"kubernetes.io/projected/a75f41ed-628b-4e88-8d67-ada299f1c7a9-kube-api-access-jkvjh\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430600 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430609 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc 
kubenswrapper[4848]: I1206 15:49:29.430618 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.430628 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9624a1f-b378-4f0a-af01-12f45f8c7694-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.441906 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a75f41ed-628b-4e88-8d67-ada299f1c7a9" (UID: "a75f41ed-628b-4e88-8d67-ada299f1c7a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.540975 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75f41ed-628b-4e88-8d67-ada299f1c7a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.556893 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08ab771a-21e5-4145-8954-8ac8c039a8c4","Type":"ContainerStarted","Data":"26cc3eb94f2a08d5353496f945562c0830c11944242740b97065611e431446d2"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.587942 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f14ea76-e339-4e47-9063-898de1d2fac8","Type":"ContainerStarted","Data":"8c78173c94223b398f8782405d43f6c1ee69b4f98c967a267e24991c18936efe"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.599904 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7d9-account-create-update-fx892" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.600795 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7d9-account-create-update-fx892" event={"ID":"81599cc7-40f1-4f1c-96e5-9465b89b0517","Type":"ContainerDied","Data":"c4d8c1e306b30722090a9bfc31e5eae81cc5f19a2f94fd4e7fb08f83b1d61e72"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.600838 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d8c1e306b30722090a9bfc31e5eae81cc5f19a2f94fd4e7fb08f83b1d61e72" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.614761 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-5sg59"] Dec 06 15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615179 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af5a0b7-5219-4bf8-9e36-87218655227b" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615191 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af5a0b7-5219-4bf8-9e36-87218655227b" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615203 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3358e13-6e57-435c-ba4f-671c970019fd" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615212 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3358e13-6e57-435c-ba4f-671c970019fd" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615230 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerName="init" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615235 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerName="init" Dec 06 
15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615251 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerName="ironic-db-sync" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615256 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerName="ironic-db-sync" Dec 06 15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615263 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81599cc7-40f1-4f1c-96e5-9465b89b0517" containerName="mariadb-account-create-update" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615268 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="81599cc7-40f1-4f1c-96e5-9465b89b0517" containerName="mariadb-account-create-update" Dec 06 15:49:29 crc kubenswrapper[4848]: E1206 15:49:29.615278 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9624a1f-b378-4f0a-af01-12f45f8c7694" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615284 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9624a1f-b378-4f0a-af01-12f45f8c7694" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615435 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af5a0b7-5219-4bf8-9e36-87218655227b" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615447 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="81599cc7-40f1-4f1c-96e5-9465b89b0517" containerName="mariadb-account-create-update" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615462 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75f41ed-628b-4e88-8d67-ada299f1c7a9" containerName="ironic-db-sync" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615474 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e3358e13-6e57-435c-ba4f-671c970019fd" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.615483 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9624a1f-b378-4f0a-af01-12f45f8c7694" containerName="mariadb-database-create" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.616104 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.643348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.643402 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qjg9\" (UniqueName: \"kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.648930 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-5sg59"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.697971 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-7xkpk" event={"ID":"a75f41ed-628b-4e88-8d67-ada299f1c7a9","Type":"ContainerDied","Data":"3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.698006 4848 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3d9ea34303a215aa37b3a328b5b46761dac59cd0123e29796cfb8d7419d66fea" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.698080 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-7xkpk" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.704677 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7d47d5c9-wf778" event={"ID":"aead053e-0f4a-48bf-b446-9a1dbdc7e996","Type":"ContainerStarted","Data":"a717ed0abe73ba74aecb0ccc926417ba5b2059920a50c85a15a87532ebc131dc"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.724888 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-572c-account-create-update-j47lb"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.728505 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.731358 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.749641 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerStarted","Data":"e335ea420ed7e10a820c39a04443bef173e40f2138d640b155c4820b20f461a7"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.749948 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-central-agent" containerID="cri-o://461fff830f900e9841f5039799776ca4845958258b2970d2b68d2ed071f5145b" gracePeriod=30 Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.750126 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 
15:49:29.750171 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="proxy-httpd" containerID="cri-o://e335ea420ed7e10a820c39a04443bef173e40f2138d640b155c4820b20f461a7" gracePeriod=30 Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.750209 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="sg-core" containerID="cri-o://4f1dbaefbc8189682f09640c39bf6e00ceab627e5f8b870492b8326d762c2f08" gracePeriod=30 Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.750249 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-notification-agent" containerID="cri-o://8c0e142bdef49cd5e2663be8c1ca22e6a36d0866224f6a9ca9065b17aad9e042" gracePeriod=30 Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.753308 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.753351 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qjg9\" (UniqueName: \"kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.753407 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts\") pod \"ironic-inspector-572c-account-create-update-j47lb\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.753822 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kld66\" (UniqueName: \"kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66\") pod \"ironic-inspector-572c-account-create-update-j47lb\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.754548 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.769522 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-5f6db98496-rh44f"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.770760 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.774820 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-dockercfg-zq8gd" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.782674 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-572c-account-create-update-j47lb"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.786728 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-5f6db98496-rh44f"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.801865 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.821278 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9t25c" event={"ID":"d9624a1f-b378-4f0a-af01-12f45f8c7694","Type":"ContainerDied","Data":"d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.821744 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d477ebe9cf9bc66a7f1685093708f92f1c4a9605b2dd930f05f38ffddb34485d" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.821823 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9t25c" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.822775 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qjg9\" (UniqueName: \"kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9\") pod \"ironic-inspector-db-create-5sg59\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.830633 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxr2n" event={"ID":"e3358e13-6e57-435c-ba4f-671c970019fd","Type":"ContainerDied","Data":"e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.830678 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f6185f198219f5bedbffe580e5e8324ef882c3843e4f685892c5776ef0ae10" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.830762 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wxr2n" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.843840 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61e60c86-1fae-4b73-9c2c-bb5bdd108630","Type":"ContainerStarted","Data":"81c7d810f16e59d6095631773efa6b3f632991096b7381d35d941cf699776d0e"} Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.865385 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts\") pod \"ironic-inspector-572c-account-create-update-j47lb\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.865529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kld66\" (UniqueName: \"kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66\") pod \"ironic-inspector-572c-account-create-update-j47lb\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.869926 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts\") pod \"ironic-inspector-572c-account-create-update-j47lb\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.897629 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kld66\" (UniqueName: \"kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66\") pod \"ironic-inspector-572c-account-create-update-j47lb\" 
(UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.914495 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.923551 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.690459713 podStartE2EDuration="9.923530979s" podCreationTimestamp="2025-12-06 15:49:20 +0000 UTC" firstStartedPulling="2025-12-06 15:49:21.516947923 +0000 UTC m=+1228.814958836" lastFinishedPulling="2025-12-06 15:49:28.750019189 +0000 UTC m=+1236.048030102" observedRunningTime="2025-12-06 15:49:29.88918952 +0000 UTC m=+1237.187200433" watchObservedRunningTime="2025-12-06 15:49:29.923530979 +0000 UTC m=+1237.221541892" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.924165 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.926040 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.932162 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.932366 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.932879 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.933055 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.963751 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.972746 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-config\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.972828 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24k2\" (UniqueName: \"kubernetes.io/projected/692f44d3-ff17-419f-b16c-b37f71521603-kube-api-access-g24k2\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:29 crc kubenswrapper[4848]: I1206 15:49:29.972867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-combined-ca-bundle\") 
pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.030993 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.074642 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8s2s\" (UniqueName: \"kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.074871 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-config\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.074928 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.074952 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.074980 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075031 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075077 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075104 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075129 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24k2\" (UniqueName: \"kubernetes.io/projected/692f44d3-ff17-419f-b16c-b37f71521603-kube-api-access-g24k2\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075188 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.075228 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-combined-ca-bundle\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.088232 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-config\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.094743 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692f44d3-ff17-419f-b16c-b37f71521603-combined-ca-bundle\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.112333 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24k2\" (UniqueName: \"kubernetes.io/projected/692f44d3-ff17-419f-b16c-b37f71521603-kube-api-access-g24k2\") pod \"ironic-neutron-agent-5f6db98496-rh44f\" (UID: \"692f44d3-ff17-419f-b16c-b37f71521603\") " pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178264 
4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178322 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178351 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178386 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178407 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178434 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178470 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.178524 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8s2s\" (UniqueName: \"kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.179295 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.181196 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.205041 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8s2s\" (UniqueName: \"kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s\") pod \"ironic-7f6646d5b4-tzftd\" 
(UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.216233 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.218201 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.221931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.224148 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.235248 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.290157 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle\") pod \"ironic-7f6646d5b4-tzftd\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.298497 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.688882 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.692895 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.696161 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.696633 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.705812 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798081 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-scripts\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798414 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ab198686-7839-4e39-abdb-ea9b65893a02-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798441 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798497 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798564 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798604 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxw9r\" (UniqueName: 
\"kubernetes.io/projected/ab198686-7839-4e39-abdb-ea9b65893a02-kube-api-access-vxw9r\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.798629 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.868053 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.886259 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902766 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxw9r\" (UniqueName: \"kubernetes.io/projected/ab198686-7839-4e39-abdb-ea9b65893a02-kube-api-access-vxw9r\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902812 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902837 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-scripts\") pod \"ironic-conductor-0\" 
(UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902916 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ab198686-7839-4e39-abdb-ea9b65893a02-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902939 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902970 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.902995 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.903020 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.911098 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.920719 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.926805 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-572c-account-create-update-j47lb"] Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.932294 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ab198686-7839-4e39-abdb-ea9b65893a02-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.932717 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.937820 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-scripts\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 
15:49:30.939521 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.940731 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f14ea76-e339-4e47-9063-898de1d2fac8","Type":"ContainerStarted","Data":"1d3c8a5ac9e041952c2b419870194e0ad7e44a4849a55898ce705564d3e6451f"} Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.949511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab198686-7839-4e39-abdb-ea9b65893a02-config-data\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.957374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61e60c86-1fae-4b73-9c2c-bb5bdd108630","Type":"ContainerStarted","Data":"b5d2a30ac0f4b181fc7bba28d28ef43c39f73f81ed15c1db5f4c3b214f3954e8"} Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.959316 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxw9r\" (UniqueName: \"kubernetes.io/projected/ab198686-7839-4e39-abdb-ea9b65893a02-kube-api-access-vxw9r\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.969806 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.969788217 podStartE2EDuration="4.969788217s" podCreationTimestamp="2025-12-06 15:49:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:30.969770086 +0000 UTC m=+1238.267780999" watchObservedRunningTime="2025-12-06 15:49:30.969788217 +0000 UTC m=+1238.267799130" Dec 06 15:49:30 crc kubenswrapper[4848]: I1206 15:49:30.995235 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.004323 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts\") pod \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.004446 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts\") pod \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.004526 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svvtz\" (UniqueName: \"kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz\") pod \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\" (UID: \"7d15aeb9-60fe-4de1-a715-7843431f9f7f\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.004598 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h77f\" (UniqueName: \"kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f\") pod \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\" (UID: \"f4ae7cfb-1f4e-4947-94c8-d358f5e36476\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.005242 4848 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4ae7cfb-1f4e-4947-94c8-d358f5e36476" (UID: "f4ae7cfb-1f4e-4947-94c8-d358f5e36476"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.005475 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.006084 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d15aeb9-60fe-4de1-a715-7843431f9f7f" (UID: "7d15aeb9-60fe-4de1-a715-7843431f9f7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.013289 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz" (OuterVolumeSpecName: "kube-api-access-svvtz") pod "7d15aeb9-60fe-4de1-a715-7843431f9f7f" (UID: "7d15aeb9-60fe-4de1-a715-7843431f9f7f"). InnerVolumeSpecName "kube-api-access-svvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.019812 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f" (OuterVolumeSpecName: "kube-api-access-2h77f") pod "f4ae7cfb-1f4e-4947-94c8-d358f5e36476" (UID: "f4ae7cfb-1f4e-4947-94c8-d358f5e36476"). InnerVolumeSpecName "kube-api-access-2h77f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027368 4848 generic.go:334] "Generic (PLEG): container finished" podID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerID="e335ea420ed7e10a820c39a04443bef173e40f2138d640b155c4820b20f461a7" exitCode=0 Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027553 4848 generic.go:334] "Generic (PLEG): container finished" podID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerID="4f1dbaefbc8189682f09640c39bf6e00ceab627e5f8b870492b8326d762c2f08" exitCode=2 Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027642 4848 generic.go:334] "Generic (PLEG): container finished" podID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerID="8c0e142bdef49cd5e2663be8c1ca22e6a36d0866224f6a9ca9065b17aad9e042" exitCode=0 Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027679 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7d47d5c9-wf778" event={"ID":"aead053e-0f4a-48bf-b446-9a1dbdc7e996","Type":"ContainerStarted","Data":"94bc1003cbea8e37ad32ac661bf683d8a4bd98a714a495a67727991a1ddf62c6"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027745 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8c96-account-create-update-r6sqf" event={"ID":"f4ae7cfb-1f4e-4947-94c8-d358f5e36476","Type":"ContainerDied","Data":"2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027764 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c851c06d7c548464ce169820e4bf26326bf1387b45b539310ddc6f7034f9d0f" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027776 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerDied","Data":"e335ea420ed7e10a820c39a04443bef173e40f2138d640b155c4820b20f461a7"} Dec 06 15:49:31 crc 
kubenswrapper[4848]: I1206 15:49:31.027797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerDied","Data":"4f1dbaefbc8189682f09640c39bf6e00ceab627e5f8b870492b8326d762c2f08"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027809 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerDied","Data":"8c0e142bdef49cd5e2663be8c1ca22e6a36d0866224f6a9ca9065b17aad9e042"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.027973 4848 generic.go:334] "Generic (PLEG): container finished" podID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerID="461fff830f900e9841f5039799776ca4845958258b2970d2b68d2ed071f5145b" exitCode=0 Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.028134 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerDied","Data":"461fff830f900e9841f5039799776ca4845958258b2970d2b68d2ed071f5145b"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.039509 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ironic-conductor-0\" (UID: \"ab198686-7839-4e39-abdb-ea9b65893a02\") " pod="openstack/ironic-conductor-0" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.044881 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" event={"ID":"7d15aeb9-60fe-4de1-a715-7843431f9f7f","Type":"ContainerDied","Data":"d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.044921 4848 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d42afee2445df551cb7e3d2f01295ec48a9544fff088bfcc391bacaf752b0b30" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.044989 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b459-account-create-update-7d7s8" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.056575 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08ab771a-21e5-4145-8954-8ac8c039a8c4","Type":"ContainerStarted","Data":"c729a323c07552c39e398469aafdc1f0e71aa1d897fd29af09e3491deec74c2b"} Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.107031 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d15aeb9-60fe-4de1-a715-7843431f9f7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.107264 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svvtz\" (UniqueName: \"kubernetes.io/projected/7d15aeb9-60fe-4de1-a715-7843431f9f7f-kube-api-access-svvtz\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.107370 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h77f\" (UniqueName: \"kubernetes.io/projected/f4ae7cfb-1f4e-4947-94c8-d358f5e36476-kube-api-access-2h77f\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.130016 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-5sg59"] Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.136378 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.136354834 podStartE2EDuration="5.136354834s" podCreationTimestamp="2025-12-06 15:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-06 15:49:31.104924593 +0000 UTC m=+1238.402935516" watchObservedRunningTime="2025-12-06 15:49:31.136354834 +0000 UTC m=+1238.434365747" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.202649 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.351085 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.393783 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-5f6db98496-rh44f"] Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.402363 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.576955 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.657874 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.658122 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="dnsmasq-dns" containerID="cri-o://b1b36c9f28117d863bdf25da88665a65d2890fc8af18ab559c55fc4c67f33b56" gracePeriod=10 Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.757066 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.762246 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.927265 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.927717 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.927882 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.927974 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.930797 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.931469 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.931529 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wx69\" (UniqueName: \"kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.931570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd\") pod \"c5600511-fa48-4bd3-98be-0e823cac69b2\" (UID: \"c5600511-fa48-4bd3-98be-0e823cac69b2\") " Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.932918 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.934010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.958934 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69" (OuterVolumeSpecName: "kube-api-access-6wx69") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "kube-api-access-6wx69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.963613 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts" (OuterVolumeSpecName: "scripts") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:31 crc kubenswrapper[4848]: I1206 15:49:31.998353 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.019927 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.038098 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.038138 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wx69\" (UniqueName: \"kubernetes.io/projected/c5600511-fa48-4bd3-98be-0e823cac69b2-kube-api-access-6wx69\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.038148 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5600511-fa48-4bd3-98be-0e823cac69b2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.038157 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.070913 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerStarted","Data":"a675e2ca175fe626008522b46131978e5052c3fa632fb874e51916d3fb4569ea"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.079822 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7d47d5c9-wf778" event={"ID":"aead053e-0f4a-48bf-b446-9a1dbdc7e996","Type":"ContainerStarted","Data":"849621d72cd1c784ae9ca209dc8d263c3ce57cc463c05d66582a47ef9359a2db"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.082060 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.091445 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-5sg59" event={"ID":"03352dd8-a07d-4822-adcb-64517cb16b2e","Type":"ContainerStarted","Data":"adf8c38c49c4ae1e78fe43b4be6aa1f2abd586c2601bb7cc16422ec178d3d0d5"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.094717 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" event={"ID":"69d0d42e-a46f-49c4-a637-81da68446876","Type":"ContainerStarted","Data":"9db133563f492efd945e7d08c68087da88c9986fd92e4025b8585eb29090ae97"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.106109 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b7d47d5c9-wf778" podStartSLOduration=5.10607968 podStartE2EDuration="5.10607968s" podCreationTimestamp="2025-12-06 15:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:32.098982408 +0000 UTC m=+1239.396993331" watchObservedRunningTime="2025-12-06 15:49:32.10607968 +0000 UTC m=+1239.404090593" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.107676 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5600511-fa48-4bd3-98be-0e823cac69b2","Type":"ContainerDied","Data":"f46c3cbf9a74afe37af5baa0c78f95b86f29361688fbfb01f97f0602ce62da29"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.107833 4848 scope.go:117] "RemoveContainer" containerID="e335ea420ed7e10a820c39a04443bef173e40f2138d640b155c4820b20f461a7" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.108085 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.113836 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data" (OuterVolumeSpecName: "config-data") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.123190 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5600511-fa48-4bd3-98be-0e823cac69b2" (UID: "c5600511-fa48-4bd3-98be-0e823cac69b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.134525 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerStarted","Data":"781c828887e684f5a416e2777d9dd8450dec9c98807691df8956dcb92f08699c"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.136860 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"8ccc59f92d98642d7de454dc24ac485e007c4bf4780e5ece0fdfce7b84b42b57"} Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.140225 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.140260 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5600511-fa48-4bd3-98be-0e823cac69b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.169195 4848 scope.go:117] "RemoveContainer" containerID="4f1dbaefbc8189682f09640c39bf6e00ceab627e5f8b870492b8326d762c2f08" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.193074 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.319898 4848 scope.go:117] "RemoveContainer" containerID="8c0e142bdef49cd5e2663be8c1ca22e6a36d0866224f6a9ca9065b17aad9e042" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.367711 4848 scope.go:117] "RemoveContainer" containerID="461fff830f900e9841f5039799776ca4845958258b2970d2b68d2ed071f5145b" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.551635 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.587486 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.638492 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641667 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d15aeb9-60fe-4de1-a715-7843431f9f7f" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641716 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d15aeb9-60fe-4de1-a715-7843431f9f7f" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641736 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="sg-core" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641743 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="sg-core" Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641760 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="proxy-httpd" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641768 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="proxy-httpd" Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641790 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ae7cfb-1f4e-4947-94c8-d358f5e36476" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641798 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ae7cfb-1f4e-4947-94c8-d358f5e36476" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641845 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-notification-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641873 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-notification-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: E1206 15:49:32.641895 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-central-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.641903 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-central-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.642402 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d15aeb9-60fe-4de1-a715-7843431f9f7f" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 
15:49:32.642421 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ae7cfb-1f4e-4947-94c8-d358f5e36476" containerName="mariadb-account-create-update" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.642438 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="sg-core" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.642450 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-central-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.642464 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="proxy-httpd" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.642473 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" containerName="ceilometer-notification-agent" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.648027 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.657024 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.657215 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.662257 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760275 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760747 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760823 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " 
pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760859 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7ds\" (UniqueName: \"kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.760932 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862375 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862468 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7ds\" (UniqueName: \"kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862535 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862561 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862607 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862658 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.862725 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.863058 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc 
kubenswrapper[4848]: I1206 15:49:32.863122 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.870721 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.871179 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.873258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.873915 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts\") pod \"ceilometer-0\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.883578 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7ds\" (UniqueName: \"kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds\") pod \"ceilometer-0\" (UID: 
\"615e4244-19fe-4703-85b3-86086a3e630d\") " pod="openstack/ceilometer-0" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.983045 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5600511-fa48-4bd3-98be-0e823cac69b2" path="/var/lib/kubelet/pods/c5600511-fa48-4bd3-98be-0e823cac69b2/volumes" Dec 06 15:49:32 crc kubenswrapper[4848]: I1206 15:49:32.996001 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.173279 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" event={"ID":"69d0d42e-a46f-49c4-a637-81da68446876","Type":"ContainerStarted","Data":"d1db1c5f39966e4065b63ca4719f2e1b399ba6f6336fd2f892f6a886f7a61be9"} Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.182126 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"aee1cf05686d0a385770beff7c36b40a405a3c4aecae26655f284fd8378c5485"} Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.188126 4848 generic.go:334] "Generic (PLEG): container finished" podID="7773ec39-baea-46cd-bd39-520ba343805d" containerID="b1b36c9f28117d863bdf25da88665a65d2890fc8af18ab559c55fc4c67f33b56" exitCode=0 Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.188209 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" event={"ID":"7773ec39-baea-46cd-bd39-520ba343805d","Type":"ContainerDied","Data":"b1b36c9f28117d863bdf25da88665a65d2890fc8af18ab559c55fc4c67f33b56"} Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.198437 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61e60c86-1fae-4b73-9c2c-bb5bdd108630","Type":"ContainerStarted","Data":"fc725cce789bf9773dc9ad81c60dcf0c264764a304f5f74af131db4f72db2a4c"} 
Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.199438 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.208230 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-5sg59" event={"ID":"03352dd8-a07d-4822-adcb-64517cb16b2e","Type":"ContainerDied","Data":"6bcdd131a08cd0c20a53527616c28bf91c743559589348c37fe7ab6cceaa4743"} Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.209084 4848 generic.go:334] "Generic (PLEG): container finished" podID="03352dd8-a07d-4822-adcb-64517cb16b2e" containerID="6bcdd131a08cd0c20a53527616c28bf91c743559589348c37fe7ab6cceaa4743" exitCode=0 Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.239433 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="cinder-scheduler" containerID="cri-o://b2efa2e5aa05abf13e893ea4a3903cac7e2ba3d32b0068de4b60c7c28e364a6f" gracePeriod=30 Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.239725 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="probe" containerID="cri-o://7d4d795618e6a64a2089b08aa686ddfd43fda77d755214206db297d0c3a570ec" gracePeriod=30 Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.362436 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.36241403 podStartE2EDuration="6.36241403s" podCreationTimestamp="2025-12-06 15:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:33.277609936 +0000 UTC m=+1240.575620849" watchObservedRunningTime="2025-12-06 15:49:33.36241403 +0000 UTC m=+1240.660424943" Dec 06 15:49:33 
crc kubenswrapper[4848]: I1206 15:49:33.368270 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.452386 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-59575bb9d8-57gb5"] Dec 06 15:49:33 crc kubenswrapper[4848]: E1206 15:49:33.453131 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="dnsmasq-dns" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.453151 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="dnsmasq-dns" Dec 06 15:49:33 crc kubenswrapper[4848]: E1206 15:49:33.453178 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="init" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.453185 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="init" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.453596 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7773ec39-baea-46cd-bd39-520ba343805d" containerName="dnsmasq-dns" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.458392 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.461984 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.462776 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.472486 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-59575bb9d8-57gb5"] Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479436 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479506 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8v8p\" (UniqueName: \"kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479616 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 
15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479649 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.479673 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb\") pod \"7773ec39-baea-46cd-bd39-520ba343805d\" (UID: \"7773ec39-baea-46cd-bd39-520ba343805d\") " Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.504234 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p" (OuterVolumeSpecName: "kube-api-access-x8v8p") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "kube-api-access-x8v8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.578521 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.581643 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-merged\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.581684 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.581777 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-combined-ca-bundle\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.581896 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-logs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582099 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe62341-68ac-438b-8aa5-4b0067c8c9ea-etc-podinfo\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " 
pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582127 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6x7\" (UniqueName: \"kubernetes.io/projected/abe62341-68ac-438b-8aa5-4b0067c8c9ea-kube-api-access-zd6x7\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582154 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-internal-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582202 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-custom\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-public-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-scripts\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " 
pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582335 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.582447 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8v8p\" (UniqueName: \"kubernetes.io/projected/7773ec39-baea-46cd-bd39-520ba343805d-kube-api-access-x8v8p\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.599952 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.611275 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.612193 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.643890 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config" (OuterVolumeSpecName: "config") pod "7773ec39-baea-46cd-bd39-520ba343805d" (UID: "7773ec39-baea-46cd-bd39-520ba343805d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685052 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-custom\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685135 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-public-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-scripts\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685259 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-merged\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 
15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-combined-ca-bundle\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685404 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-logs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685429 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe62341-68ac-438b-8aa5-4b0067c8c9ea-etc-podinfo\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685453 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6x7\" (UniqueName: \"kubernetes.io/projected/abe62341-68ac-438b-8aa5-4b0067c8c9ea-kube-api-access-zd6x7\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685481 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-internal-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685558 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685572 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685585 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.685596 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7773ec39-baea-46cd-bd39-520ba343805d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.687466 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-logs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.690984 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-internal-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " 
pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.692826 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-merged\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.694103 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.701486 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data-custom\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.705359 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-config-data\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.705732 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-public-tls-certs\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.707124 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe62341-68ac-438b-8aa5-4b0067c8c9ea-etc-podinfo\") pod \"ironic-59575bb9d8-57gb5\" (UID: 
\"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.707618 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-combined-ca-bundle\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: W1206 15:49:33.710054 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615e4244_19fe_4703_85b3_86086a3e630d.slice/crio-f693dd52d5f3741464a82d5b54e07c7df6630bd9fdafe772d19d32acb58c4aa5 WatchSource:0}: Error finding container f693dd52d5f3741464a82d5b54e07c7df6630bd9fdafe772d19d32acb58c4aa5: Status 404 returned error can't find the container with id f693dd52d5f3741464a82d5b54e07c7df6630bd9fdafe772d19d32acb58c4aa5 Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.710378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6x7\" (UniqueName: \"kubernetes.io/projected/abe62341-68ac-438b-8aa5-4b0067c8c9ea-kube-api-access-zd6x7\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.713471 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe62341-68ac-438b-8aa5-4b0067c8c9ea-scripts\") pod \"ironic-59575bb9d8-57gb5\" (UID: \"abe62341-68ac-438b-8aa5-4b0067c8c9ea\") " pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.720902 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.831278 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-76qm4"] Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.833154 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.835866 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.836235 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6lzkc" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.839440 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 15:49:33 crc kubenswrapper[4848]: I1206 15:49:33.853176 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-76qm4"] Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.002638 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.007443 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.007598 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2ptl\" (UniqueName: \"kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.007830 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.108899 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.108987 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.109025 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc 
kubenswrapper[4848]: I1206 15:49:34.109088 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2ptl\" (UniqueName: \"kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.117353 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.117378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.117897 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.133979 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2ptl\" (UniqueName: \"kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl\") pod \"nova-cell0-conductor-db-sync-76qm4\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc 
kubenswrapper[4848]: I1206 15:49:34.153985 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.234151 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" event={"ID":"7773ec39-baea-46cd-bd39-520ba343805d","Type":"ContainerDied","Data":"bf7d408da9660c368f5d99a4561e58c703c5c746c0cddf278dd1906eb236e34b"} Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.234206 4848 scope.go:117] "RemoveContainer" containerID="b1b36c9f28117d863bdf25da88665a65d2890fc8af18ab559c55fc4c67f33b56" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.234580 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-kp76h" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.240391 4848 generic.go:334] "Generic (PLEG): container finished" podID="69d0d42e-a46f-49c4-a637-81da68446876" containerID="d1db1c5f39966e4065b63ca4719f2e1b399ba6f6336fd2f892f6a886f7a61be9" exitCode=0 Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.240615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" event={"ID":"69d0d42e-a46f-49c4-a637-81da68446876","Type":"ContainerDied","Data":"d1db1c5f39966e4065b63ca4719f2e1b399ba6f6336fd2f892f6a886f7a61be9"} Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.243591 4848 generic.go:334] "Generic (PLEG): container finished" podID="ab198686-7839-4e39-abdb-ea9b65893a02" containerID="aee1cf05686d0a385770beff7c36b40a405a3c4aecae26655f284fd8378c5485" exitCode=0 Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.243660 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerDied","Data":"aee1cf05686d0a385770beff7c36b40a405a3c4aecae26655f284fd8378c5485"} 
Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.252288 4848 generic.go:334] "Generic (PLEG): container finished" podID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerID="7d4d795618e6a64a2089b08aa686ddfd43fda77d755214206db297d0c3a570ec" exitCode=0 Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.252363 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerDied","Data":"7d4d795618e6a64a2089b08aa686ddfd43fda77d755214206db297d0c3a570ec"} Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.261885 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerStarted","Data":"f693dd52d5f3741464a82d5b54e07c7df6630bd9fdafe772d19d32acb58c4aa5"} Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.310512 4848 scope.go:117] "RemoveContainer" containerID="8a2a638de6128b516c16d440911fd567c590742ae834c9213eb90b297bfc2179" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.311087 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-59575bb9d8-57gb5"] Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.323511 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.334342 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-kp76h"] Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.825399 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.829555 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kld66\" (UniqueName: \"kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66\") pod \"69d0d42e-a46f-49c4-a637-81da68446876\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.829759 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts\") pod \"69d0d42e-a46f-49c4-a637-81da68446876\" (UID: \"69d0d42e-a46f-49c4-a637-81da68446876\") " Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.830245 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69d0d42e-a46f-49c4-a637-81da68446876" (UID: "69d0d42e-a46f-49c4-a637-81da68446876"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.830380 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d0d42e-a46f-49c4-a637-81da68446876-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.845811 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66" (OuterVolumeSpecName: "kube-api-access-kld66") pod "69d0d42e-a46f-49c4-a637-81da68446876" (UID: "69d0d42e-a46f-49c4-a637-81da68446876"). InnerVolumeSpecName "kube-api-access-kld66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.918322 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-76qm4"] Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.936752 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kld66\" (UniqueName: \"kubernetes.io/projected/69d0d42e-a46f-49c4-a637-81da68446876-kube-api-access-kld66\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:34 crc kubenswrapper[4848]: I1206 15:49:34.982746 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7773ec39-baea-46cd-bd39-520ba343805d" path="/var/lib/kubelet/pods/7773ec39-baea-46cd-bd39-520ba343805d/volumes" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.287969 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.287966 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-572c-account-create-update-j47lb" event={"ID":"69d0d42e-a46f-49c4-a637-81da68446876","Type":"ContainerDied","Data":"9db133563f492efd945e7d08c68087da88c9986fd92e4025b8585eb29090ae97"} Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.288407 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db133563f492efd945e7d08c68087da88c9986fd92e4025b8585eb29090ae97" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.289559 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59575bb9d8-57gb5" event={"ID":"abe62341-68ac-438b-8aa5-4b0067c8c9ea","Type":"ContainerStarted","Data":"70a36c2bbd2ca6d7080a8967f7cedc25d5c40bf92acfe19e41fc9de5d6e1150c"} Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.295263 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerStarted","Data":"d88ad6dd118a52d37f98ccf40f244e7afa3d07000a4d03ae84afeda58aa899e5"} Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.458163 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.653419 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts\") pod \"03352dd8-a07d-4822-adcb-64517cb16b2e\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.653981 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03352dd8-a07d-4822-adcb-64517cb16b2e" (UID: "03352dd8-a07d-4822-adcb-64517cb16b2e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.654122 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qjg9\" (UniqueName: \"kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9\") pod \"03352dd8-a07d-4822-adcb-64517cb16b2e\" (UID: \"03352dd8-a07d-4822-adcb-64517cb16b2e\") " Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.655396 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03352dd8-a07d-4822-adcb-64517cb16b2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.661977 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9" (OuterVolumeSpecName: "kube-api-access-4qjg9") pod "03352dd8-a07d-4822-adcb-64517cb16b2e" (UID: "03352dd8-a07d-4822-adcb-64517cb16b2e"). InnerVolumeSpecName "kube-api-access-4qjg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:35 crc kubenswrapper[4848]: I1206 15:49:35.759611 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qjg9\" (UniqueName: \"kubernetes.io/projected/03352dd8-a07d-4822-adcb-64517cb16b2e-kube-api-access-4qjg9\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.311477 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-5sg59" event={"ID":"03352dd8-a07d-4822-adcb-64517cb16b2e","Type":"ContainerDied","Data":"adf8c38c49c4ae1e78fe43b4be6aa1f2abd586c2601bb7cc16422ec178d3d0d5"} Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.311801 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf8c38c49c4ae1e78fe43b4be6aa1f2abd586c2601bb7cc16422ec178d3d0d5" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.311878 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-5sg59" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.324760 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-76qm4" event={"ID":"95a42d59-df8c-420d-bb24-c8476a868dd9","Type":"ContainerStarted","Data":"0deb40f6f166e9a45d629f5b9ca7f976b01796555e2e64b7ceb91f671a89d440"} Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.336180 4848 generic.go:334] "Generic (PLEG): container finished" podID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerID="b2efa2e5aa05abf13e893ea4a3903cac7e2ba3d32b0068de4b60c7c28e364a6f" exitCode=0 Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.336230 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerDied","Data":"b2efa2e5aa05abf13e893ea4a3903cac7e2ba3d32b0068de4b60c7c28e364a6f"} Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 
15:49:36.807199 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.807517 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.815810 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.934239 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.936441 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.940581 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.940719 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.987310 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.987573 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.988626 
4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.990795 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.991202 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cmcq\" (UniqueName: \"kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.991315 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id\") pod \"ec6fc2a8-0d31-49f6-98be-60e56583631c\" (UID: \"ec6fc2a8-0d31-49f6-98be-60e56583631c\") " Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.994474 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 15:49:36 crc kubenswrapper[4848]: I1206 15:49:36.996211 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6fc2a8-0d31-49f6-98be-60e56583631c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.001384 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts" (OuterVolumeSpecName: "scripts") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.015254 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.017741 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq" (OuterVolumeSpecName: "kube-api-access-4cmcq") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "kube-api-access-4cmcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.097893 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cmcq\" (UniqueName: \"kubernetes.io/projected/ec6fc2a8-0d31-49f6-98be-60e56583631c-kube-api-access-4cmcq\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.097925 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.097934 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.166827 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.166882 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.245826 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.285964 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data" (OuterVolumeSpecName: "config-data") pod "ec6fc2a8-0d31-49f6-98be-60e56583631c" (UID: "ec6fc2a8-0d31-49f6-98be-60e56583631c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.302678 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.302722 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6fc2a8-0d31-49f6-98be-60e56583631c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.369098 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerStarted","Data":"d5e79f67375339f06f4b4aa90450db97c503c894c63849c7f0779c4e99d5327e"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.369736 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.402348 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.402374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6fc2a8-0d31-49f6-98be-60e56583631c","Type":"ContainerDied","Data":"c5aa8039fc37400349f7445e070b6edbaf27722a959a32a199ce78d5a8f3c350"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.402430 4848 scope.go:117] "RemoveContainer" containerID="7d4d795618e6a64a2089b08aa686ddfd43fda77d755214206db297d0c3a570ec" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.409266 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" podStartSLOduration=3.383836516 podStartE2EDuration="8.409238851s" podCreationTimestamp="2025-12-06 15:49:29 +0000 UTC" firstStartedPulling="2025-12-06 15:49:31.408099606 +0000 UTC m=+1238.706110519" lastFinishedPulling="2025-12-06 15:49:36.433501931 +0000 UTC m=+1243.731512854" observedRunningTime="2025-12-06 15:49:37.386912847 +0000 UTC m=+1244.684923760" watchObservedRunningTime="2025-12-06 15:49:37.409238851 +0000 UTC m=+1244.707249774" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.456453 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerStarted","Data":"dab15829eb23d2cb75e4070b226de1894586dc02b6fe99948bce5a5169556de0"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.464387 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerStarted","Data":"209527aed1edda63466bd05ad7d617d8e7c6b3291c87fca7f9c692014030509b"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.483383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"28b28ed8-c6be-4256-8ccd-8c560959048b","Type":"ContainerStarted","Data":"f2337555ded7b0af5d9ef80f2b88604b79e5e1e513cc7e68ed8ccb8a73813750"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.490493 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59575bb9d8-57gb5" event={"ID":"abe62341-68ac-438b-8aa5-4b0067c8c9ea","Type":"ContainerStarted","Data":"e12a3631ca2865bcf0d6af4d3bae7d3d7afb10bf271a388c388145a76b60d95e"} Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.491382 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.491423 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.491755 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.492267 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.527105 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.131167638 podStartE2EDuration="35.527082069s" podCreationTimestamp="2025-12-06 15:49:02 +0000 UTC" firstStartedPulling="2025-12-06 15:49:04.900244335 +0000 UTC m=+1212.198255238" lastFinishedPulling="2025-12-06 15:49:36.296158756 +0000 UTC m=+1243.594169669" observedRunningTime="2025-12-06 15:49:37.511100487 +0000 UTC m=+1244.809111400" watchObservedRunningTime="2025-12-06 15:49:37.527082069 +0000 UTC m=+1244.825092992" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.540930 4848 scope.go:117] "RemoveContainer" containerID="b2efa2e5aa05abf13e893ea4a3903cac7e2ba3d32b0068de4b60c7c28e364a6f" Dec 06 15:49:37 crc 
kubenswrapper[4848]: I1206 15:49:37.565279 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.570774 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.595768 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:37 crc kubenswrapper[4848]: E1206 15:49:37.596289 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03352dd8-a07d-4822-adcb-64517cb16b2e" containerName="mariadb-database-create" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596307 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="03352dd8-a07d-4822-adcb-64517cb16b2e" containerName="mariadb-database-create" Dec 06 15:49:37 crc kubenswrapper[4848]: E1206 15:49:37.596337 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="probe" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596346 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="probe" Dec 06 15:49:37 crc kubenswrapper[4848]: E1206 15:49:37.596361 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d0d42e-a46f-49c4-a637-81da68446876" containerName="mariadb-account-create-update" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596369 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d0d42e-a46f-49c4-a637-81da68446876" containerName="mariadb-account-create-update" Dec 06 15:49:37 crc kubenswrapper[4848]: E1206 15:49:37.596382 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="cinder-scheduler" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596390 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="cinder-scheduler" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596627 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d0d42e-a46f-49c4-a637-81da68446876" containerName="mariadb-account-create-update" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596643 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="cinder-scheduler" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596662 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="03352dd8-a07d-4822-adcb-64517cb16b2e" containerName="mariadb-database-create" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.596680 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" containerName="probe" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.597947 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.602130 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.659677 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716601 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716643 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716779 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqfd\" (UniqueName: \"kubernetes.io/projected/d5c9c312-22cc-49cf-b342-247cfd7b1906-kube-api-access-kzqfd\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716872 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716934 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5c9c312-22cc-49cf-b342-247cfd7b1906-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.716964 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819093 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqfd\" (UniqueName: \"kubernetes.io/projected/d5c9c312-22cc-49cf-b342-247cfd7b1906-kube-api-access-kzqfd\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819190 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819244 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5c9c312-22cc-49cf-b342-247cfd7b1906-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819270 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819372 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819392 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.819931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5c9c312-22cc-49cf-b342-247cfd7b1906-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.832469 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.842291 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " 
pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.847722 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.847890 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzqfd\" (UniqueName: \"kubernetes.io/projected/d5c9c312-22cc-49cf-b342-247cfd7b1906-kube-api-access-kzqfd\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.848137 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c9c312-22cc-49cf-b342-247cfd7b1906-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5c9c312-22cc-49cf-b342-247cfd7b1906\") " pod="openstack/cinder-scheduler-0" Dec 06 15:49:37 crc kubenswrapper[4848]: I1206 15:49:37.940359 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.521401 4848 generic.go:334] "Generic (PLEG): container finished" podID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerID="dab15829eb23d2cb75e4070b226de1894586dc02b6fe99948bce5a5169556de0" exitCode=0 Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.521476 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerDied","Data":"dab15829eb23d2cb75e4070b226de1894586dc02b6fe99948bce5a5169556de0"} Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.537271 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerStarted","Data":"e0bc9e287c3cc42a6a298e11838fa4bbe6d24d742c003188cec870442966806c"} Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.543887 4848 generic.go:334] "Generic (PLEG): container finished" podID="abe62341-68ac-438b-8aa5-4b0067c8c9ea" containerID="e12a3631ca2865bcf0d6af4d3bae7d3d7afb10bf271a388c388145a76b60d95e" exitCode=0 Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.544630 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59575bb9d8-57gb5" event={"ID":"abe62341-68ac-438b-8aa5-4b0067c8c9ea","Type":"ContainerDied","Data":"e12a3631ca2865bcf0d6af4d3bae7d3d7afb10bf271a388c388145a76b60d95e"} Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.556261 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 15:49:38 crc kubenswrapper[4848]: I1206 15:49:38.981689 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6fc2a8-0d31-49f6-98be-60e56583631c" path="/var/lib/kubelet/pods/ec6fc2a8-0d31-49f6-98be-60e56583631c/volumes" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.572534 4848 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d5c9c312-22cc-49cf-b342-247cfd7b1906","Type":"ContainerStarted","Data":"0b39df0db0128415f46c613b12b7e9a99cdeea373b6ba357bc38ded29989067c"} Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.593089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59575bb9d8-57gb5" event={"ID":"abe62341-68ac-438b-8aa5-4b0067c8c9ea","Type":"ContainerStarted","Data":"c3ca01eb28f9cf75c2ecef9f576637e8fa4a7e0142ec418df6e9340d9b0a1faa"} Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.593137 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59575bb9d8-57gb5" event={"ID":"abe62341-68ac-438b-8aa5-4b0067c8c9ea","Type":"ContainerStarted","Data":"23347a990c30e713543d4906bb526af2740ada942cbdbd0a8137f11064ca8e60"} Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.594463 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.614072 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.614090 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.615501 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerStarted","Data":"7a6d82e2e9a37041bab589a6f1ad402baac51fcc59f88caf87f6c40a812a2e76"} Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.615534 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.615596 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.615604 4848 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.644085 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-59575bb9d8-57gb5" podStartSLOduration=4.394888944 podStartE2EDuration="6.644067176s" podCreationTimestamp="2025-12-06 15:49:33 +0000 UTC" firstStartedPulling="2025-12-06 15:49:34.340163415 +0000 UTC m=+1241.638174328" lastFinishedPulling="2025-12-06 15:49:36.589341647 +0000 UTC m=+1243.887352560" observedRunningTime="2025-12-06 15:49:39.635480144 +0000 UTC m=+1246.933491077" watchObservedRunningTime="2025-12-06 15:49:39.644067176 +0000 UTC m=+1246.942078089" Dec 06 15:49:39 crc kubenswrapper[4848]: I1206 15:49:39.716134 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-7f6646d5b4-tzftd" podStartSLOduration=5.706138105 podStartE2EDuration="10.716116995s" podCreationTimestamp="2025-12-06 15:49:29 +0000 UTC" firstStartedPulling="2025-12-06 15:49:31.424369885 +0000 UTC m=+1238.722380798" lastFinishedPulling="2025-12-06 15:49:36.434348775 +0000 UTC m=+1243.732359688" observedRunningTime="2025-12-06 15:49:39.667798568 +0000 UTC m=+1246.965809491" watchObservedRunningTime="2025-12-06 15:49:39.716116995 +0000 UTC m=+1247.014127908" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.294191 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.365715 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.369857 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.629796 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerID="6132d5a8c1fec9ff3a07f080b1db9bc58ca27b6f18f686b3b416053e75ca3e98" exitCode=1 Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.629893 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerDied","Data":"6132d5a8c1fec9ff3a07f080b1db9bc58ca27b6f18f686b3b416053e75ca3e98"} Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.630664 4848 scope.go:117] "RemoveContainer" containerID="6132d5a8c1fec9ff3a07f080b1db9bc58ca27b6f18f686b3b416053e75ca3e98" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.636145 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d5c9c312-22cc-49cf-b342-247cfd7b1906","Type":"ContainerStarted","Data":"81cbe22e80f19ba7a25eb03b43aa35953dc2f7fd833d44ef3426cb4837ac7baa"} Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.702999 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 15:49:40 crc kubenswrapper[4848]: I1206 15:49:40.703089 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.268143 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.667231 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerStarted","Data":"064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7"} Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.669752 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.692176 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerStarted","Data":"fbbe98d9b3d20c0140f2140d8bc2834ea79d8ac45d2a4b9ad623cf034626ca05"} Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.692775 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.696944 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d5c9c312-22cc-49cf-b342-247cfd7b1906","Type":"ContainerStarted","Data":"270043fd2f462f5b0c684ec1e41143bad813776f652dc87cdd84f4f0168d8534"} Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.726394 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.46985264 podStartE2EDuration="9.726377464s" podCreationTimestamp="2025-12-06 15:49:32 +0000 UTC" firstStartedPulling="2025-12-06 15:49:33.721438555 +0000 UTC m=+1241.019449468" lastFinishedPulling="2025-12-06 15:49:39.977963379 +0000 UTC m=+1247.275974292" observedRunningTime="2025-12-06 15:49:41.711722818 +0000 UTC m=+1249.009733731" watchObservedRunningTime="2025-12-06 15:49:41.726377464 +0000 UTC m=+1249.024388377" Dec 06 15:49:41 crc kubenswrapper[4848]: I1206 15:49:41.754148 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.754126935 podStartE2EDuration="4.754126935s" podCreationTimestamp="2025-12-06 15:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:49:41.732897271 +0000 UTC m=+1249.030908184" watchObservedRunningTime="2025-12-06 15:49:41.754126935 +0000 UTC m=+1249.052137858" Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.061711 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/cinder-api-0" Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.722509 4848 generic.go:334] "Generic (PLEG): container finished" podID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" exitCode=1 Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.723366 4848 scope.go:117] "RemoveContainer" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" Dec 06 15:49:42 crc kubenswrapper[4848]: E1206 15:49:42.723569 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7f6646d5b4-tzftd_openstack(cd7ef2f0-4fc0-4e48-a862-7818d1989187)\"" pod="openstack/ironic-7f6646d5b4-tzftd" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.724204 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerDied","Data":"064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7"} Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.724345 4848 scope.go:117] "RemoveContainer" containerID="6132d5a8c1fec9ff3a07f080b1db9bc58ca27b6f18f686b3b416053e75ca3e98" Dec 06 15:49:42 crc kubenswrapper[4848]: I1206 15:49:42.941192 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 15:49:43 crc kubenswrapper[4848]: I1206 15:49:43.742887 4848 generic.go:334] "Generic (PLEG): container finished" podID="692f44d3-ff17-419f-b16c-b37f71521603" containerID="d5e79f67375339f06f4b4aa90450db97c503c894c63849c7f0779c4e99d5327e" exitCode=1 Dec 06 15:49:43 crc kubenswrapper[4848]: I1206 15:49:43.742973 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" 
event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerDied","Data":"d5e79f67375339f06f4b4aa90450db97c503c894c63849c7f0779c4e99d5327e"} Dec 06 15:49:43 crc kubenswrapper[4848]: I1206 15:49:43.743603 4848 scope.go:117] "RemoveContainer" containerID="d5e79f67375339f06f4b4aa90450db97c503c894c63849c7f0779c4e99d5327e" Dec 06 15:49:43 crc kubenswrapper[4848]: I1206 15:49:43.744887 4848 scope.go:117] "RemoveContainer" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" Dec 06 15:49:43 crc kubenswrapper[4848]: E1206 15:49:43.746173 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7f6646d5b4-tzftd_openstack(cd7ef2f0-4fc0-4e48-a862-7818d1989187)\"" pod="openstack/ironic-7f6646d5b4-tzftd" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.757493 4848 scope.go:117] "RemoveContainer" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" Dec 06 15:49:44 crc kubenswrapper[4848]: E1206 15:49:44.757682 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7f6646d5b4-tzftd_openstack(cd7ef2f0-4fc0-4e48-a862-7818d1989187)\"" pod="openstack/ironic-7f6646d5b4-tzftd" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.757761 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-6xvjg"] Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.759240 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.765883 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-6xvjg"] Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.807552 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.807639 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.910773 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.910873 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.910901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.911327 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.911424 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjtnn\" (UniqueName: \"kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.911458 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:44 crc kubenswrapper[4848]: I1206 15:49:44.911737 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013615 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013690 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013754 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013788 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjtnn\" (UniqueName: \"kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.013948 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.014120 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.015050 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.015588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.022847 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.023043 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.033296 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.039069 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.044344 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjtnn\" (UniqueName: \"kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn\") pod \"ironic-inspector-db-sync-6xvjg\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.132212 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.237081 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.237131 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.292881 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-59575bb9d8-57gb5" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.299494 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.354274 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.766350 4848 scope.go:117] "RemoveContainer" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" Dec 06 15:49:45 crc kubenswrapper[4848]: E1206 15:49:45.766729 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7f6646d5b4-tzftd_openstack(cd7ef2f0-4fc0-4e48-a862-7818d1989187)\"" pod="openstack/ironic-7f6646d5b4-tzftd" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.928387 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.928714 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-central-agent" 
containerID="cri-o://d88ad6dd118a52d37f98ccf40f244e7afa3d07000a4d03ae84afeda58aa899e5" gracePeriod=30 Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.928885 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="sg-core" containerID="cri-o://e0bc9e287c3cc42a6a298e11838fa4bbe6d24d742c003188cec870442966806c" gracePeriod=30 Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.928977 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="proxy-httpd" containerID="cri-o://fbbe98d9b3d20c0140f2140d8bc2834ea79d8ac45d2a4b9ad623cf034626ca05" gracePeriod=30 Dec 06 15:49:45 crc kubenswrapper[4848]: I1206 15:49:45.928847 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-notification-agent" containerID="cri-o://209527aed1edda63466bd05ad7d617d8e7c6b3291c87fca7f9c692014030509b" gracePeriod=30 Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.776912 4848 generic.go:334] "Generic (PLEG): container finished" podID="615e4244-19fe-4703-85b3-86086a3e630d" containerID="fbbe98d9b3d20c0140f2140d8bc2834ea79d8ac45d2a4b9ad623cf034626ca05" exitCode=0 Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.776966 4848 generic.go:334] "Generic (PLEG): container finished" podID="615e4244-19fe-4703-85b3-86086a3e630d" containerID="e0bc9e287c3cc42a6a298e11838fa4bbe6d24d742c003188cec870442966806c" exitCode=2 Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.776975 4848 generic.go:334] "Generic (PLEG): container finished" podID="615e4244-19fe-4703-85b3-86086a3e630d" containerID="209527aed1edda63466bd05ad7d617d8e7c6b3291c87fca7f9c692014030509b" exitCode=0 Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.776986 4848 generic.go:334] 
"Generic (PLEG): container finished" podID="615e4244-19fe-4703-85b3-86086a3e630d" containerID="d88ad6dd118a52d37f98ccf40f244e7afa3d07000a4d03ae84afeda58aa899e5" exitCode=0 Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.777011 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerDied","Data":"fbbe98d9b3d20c0140f2140d8bc2834ea79d8ac45d2a4b9ad623cf034626ca05"} Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.777110 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerDied","Data":"e0bc9e287c3cc42a6a298e11838fa4bbe6d24d742c003188cec870442966806c"} Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.777138 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerDied","Data":"209527aed1edda63466bd05ad7d617d8e7c6b3291c87fca7f9c692014030509b"} Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.777147 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerDied","Data":"d88ad6dd118a52d37f98ccf40f244e7afa3d07000a4d03ae84afeda58aa899e5"} Dec 06 15:49:46 crc kubenswrapper[4848]: I1206 15:49:46.777246 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-7f6646d5b4-tzftd" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api-log" containerID="cri-o://7a6d82e2e9a37041bab589a6f1ad402baac51fcc59f88caf87f6c40a812a2e76" gracePeriod=60 Dec 06 15:49:47 crc kubenswrapper[4848]: I1206 15:49:47.150571 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:49:47 crc kubenswrapper[4848]: I1206 15:49:47.150632 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:49:47 crc kubenswrapper[4848]: I1206 15:49:47.789856 4848 generic.go:334] "Generic (PLEG): container finished" podID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerID="7a6d82e2e9a37041bab589a6f1ad402baac51fcc59f88caf87f6c40a812a2e76" exitCode=143 Dec 06 15:49:47 crc kubenswrapper[4848]: I1206 15:49:47.789902 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerDied","Data":"7a6d82e2e9a37041bab589a6f1ad402baac51fcc59f88caf87f6c40a812a2e76"} Dec 06 15:49:48 crc kubenswrapper[4848]: I1206 15:49:48.227598 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 15:49:51 crc kubenswrapper[4848]: I1206 15:49:51.188961 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:49:54 crc kubenswrapper[4848]: I1206 15:49:54.109669 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-6xvjg"] Dec 06 15:49:55 crc kubenswrapper[4848]: W1206 15:49:55.643030 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb815e0a_f0ff_40d4_b1c4_1220b71db056.slice/crio-f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d WatchSource:0}: Error finding container f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d: Status 404 returned error can't find 
the container with id f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d Dec 06 15:49:55 crc kubenswrapper[4848]: E1206 15:49:55.745780 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 06 15:49:55 crc kubenswrapper[4848]: E1206 15:49:55.745933 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2ptl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotP
resent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-76qm4_openstack(95a42d59-df8c-420d-bb24-c8476a868dd9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:49:55 crc kubenswrapper[4848]: E1206 15:49:55.747040 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-76qm4" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" Dec 06 15:49:55 crc kubenswrapper[4848]: I1206 15:49:55.861735 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6xvjg" event={"ID":"fb815e0a-f0ff-40d4-b1c4-1220b71db056","Type":"ContainerStarted","Data":"f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d"} Dec 06 15:49:55 crc kubenswrapper[4848]: E1206 15:49:55.864499 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-76qm4" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.174895 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-6b7d47d5c9-wf778" Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.243203 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.243463 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85796f7496-8zdjc" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-api" containerID="cri-o://e17acfc291a003894d0b7e0c70eaebbd7a9553f630e0ad2bb75196abadc3b11a" gracePeriod=30 Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.244055 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85796f7496-8zdjc" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-httpd" containerID="cri-o://3b6171b08d63c284cb3b979d9307061c12a1098ffb17f2d08e2487a26fe23f44" gracePeriod=30 Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.886351 4848 generic.go:334] "Generic (PLEG): container finished" podID="b51797dd-e2f2-497c-a13c-921ab2868646" containerID="3b6171b08d63c284cb3b979d9307061c12a1098ffb17f2d08e2487a26fe23f44" exitCode=0 Dec 06 15:49:58 crc kubenswrapper[4848]: I1206 15:49:58.886436 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerDied","Data":"3b6171b08d63c284cb3b979d9307061c12a1098ffb17f2d08e2487a26fe23f44"} Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.382479 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:50:01 crc kubenswrapper[4848]: E1206 15:50:01.413662 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/ironic-python-agent:current-podified" Dec 06 15:50:01 crc kubenswrapper[4848]: E1206 15:50:01.413823 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ironic-python-agent-init,Image:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DEST_DIR,Value:/var/lib/ironic/httpboot,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-merged,ReadOnly:false,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-podinfo,ReadOnly:false,MountPath:/etc/podinfo,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib-ironic,ReadOnly:false,MountPath:/var/lib/ironic,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-custom,ReadOnly:true,MountPath:/var/lib/config-data/custom,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vxw9r,ReadOnly:true,MountPath:/var
/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-conductor-0_openstack(ab198686-7839-4e39-abdb-ea9b65893a02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 15:50:01 crc kubenswrapper[4848]: E1206 15:50:01.415020 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-python-agent-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ironic-conductor-0" podUID="ab198686-7839-4e39-abdb-ea9b65893a02" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470120 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470172 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470208 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470226 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470247 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8s2s\" (UniqueName: \"kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470370 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470388 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo\") pod \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.470402 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs\") pod 
\"cd7ef2f0-4fc0-4e48-a862-7818d1989187\" (UID: \"cd7ef2f0-4fc0-4e48-a862-7818d1989187\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.471903 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs" (OuterVolumeSpecName: "logs") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.487859 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s" (OuterVolumeSpecName: "kube-api-access-t8s2s") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "kube-api-access-t8s2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.487955 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.497504 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts" (OuterVolumeSpecName: "scripts") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.505166 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.516029 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.559113 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data" (OuterVolumeSpecName: "config-data") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573345 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8s2s\" (UniqueName: \"kubernetes.io/projected/cd7ef2f0-4fc0-4e48-a862-7818d1989187-kube-api-access-t8s2s\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573383 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573393 4848 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ef2f0-4fc0-4e48-a862-7818d1989187-etc-podinfo\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573404 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573415 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573427 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.573435 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7ef2f0-4fc0-4e48-a862-7818d1989187-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.582974 4848 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd7ef2f0-4fc0-4e48-a862-7818d1989187" (UID: "cd7ef2f0-4fc0-4e48-a862-7818d1989187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.614615 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.673865 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.673929 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.673970 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674060 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674124 
4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb7ds\" (UniqueName: \"kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674146 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674179 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle\") pod \"615e4244-19fe-4703-85b3-86086a3e630d\" (UID: \"615e4244-19fe-4703-85b3-86086a3e630d\") " Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674399 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674548 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ef2f0-4fc0-4e48-a862-7818d1989187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674567 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.674572 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.678280 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts" (OuterVolumeSpecName: "scripts") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.680926 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds" (OuterVolumeSpecName: "kube-api-access-hb7ds") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "kube-api-access-hb7ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.698034 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.754528 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.772586 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data" (OuterVolumeSpecName: "config-data") pod "615e4244-19fe-4703-85b3-86086a3e630d" (UID: "615e4244-19fe-4703-85b3-86086a3e630d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.775967 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.775999 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb7ds\" (UniqueName: \"kubernetes.io/projected/615e4244-19fe-4703-85b3-86086a3e630d-kube-api-access-hb7ds\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.776015 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/615e4244-19fe-4703-85b3-86086a3e630d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.776027 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.776055 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.776067 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/615e4244-19fe-4703-85b3-86086a3e630d-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.920044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerStarted","Data":"1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94"} Dec 06 15:50:01 crc 
kubenswrapper[4848]: I1206 15:50:01.920127 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.922784 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7f6646d5b4-tzftd" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.922793 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7f6646d5b4-tzftd" event={"ID":"cd7ef2f0-4fc0-4e48-a862-7818d1989187","Type":"ContainerDied","Data":"a675e2ca175fe626008522b46131978e5052c3fa632fb874e51916d3fb4569ea"} Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.922913 4848 scope.go:117] "RemoveContainer" containerID="064455e5924a3ad2d64a88b28ab13604cb55157d62baa6b923c8e78e74aee3d7" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.936240 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.937383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"615e4244-19fe-4703-85b3-86086a3e630d","Type":"ContainerDied","Data":"f693dd52d5f3741464a82d5b54e07c7df6630bd9fdafe772d19d32acb58c4aa5"} Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.957925 4848 scope.go:117] "RemoveContainer" containerID="7a6d82e2e9a37041bab589a6f1ad402baac51fcc59f88caf87f6c40a812a2e76" Dec 06 15:50:01 crc kubenswrapper[4848]: E1206 15:50:01.957946 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-python-agent-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/ironic-python-agent:current-podified\\\"\"" pod="openstack/ironic-conductor-0" podUID="ab198686-7839-4e39-abdb-ea9b65893a02" Dec 06 15:50:01 crc kubenswrapper[4848]: I1206 15:50:01.998373 4848 scope.go:117] "RemoveContainer" 
containerID="dab15829eb23d2cb75e4070b226de1894586dc02b6fe99948bce5a5169556de0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.004364 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.017367 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-7f6646d5b4-tzftd"] Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.033753 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.040553 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.049512 4848 scope.go:117] "RemoveContainer" containerID="fbbe98d9b3d20c0140f2140d8bc2834ea79d8ac45d2a4b9ad623cf034626ca05" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.054743 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055158 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055172 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055184 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-notification-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055193 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-notification-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055202 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-central-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055208 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-central-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055223 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="sg-core" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055229 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="sg-core" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055245 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="proxy-httpd" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055250 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="proxy-httpd" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055263 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api-log" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055269 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api-log" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055283 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="init" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055289 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="init" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055462 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e4244-19fe-4703-85b3-86086a3e630d" 
containerName="ceilometer-central-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055480 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="ceilometer-notification-agent" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055487 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055501 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="proxy-httpd" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055509 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api-log" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055520 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e4244-19fe-4703-85b3-86086a3e630d" containerName="sg-core" Dec 06 15:50:02 crc kubenswrapper[4848]: E1206 15:50:02.055677 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055684 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.055878 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" containerName="ironic-api" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.057178 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.058333 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.067819 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.068981 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.106876 4848 scope.go:117] "RemoveContainer" containerID="e0bc9e287c3cc42a6a298e11838fa4bbe6d24d742c003188cec870442966806c" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184448 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184514 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184586 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhdq\" (UniqueName: \"kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184612 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184643 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184667 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.184689 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.283183 4848 scope.go:117] "RemoveContainer" containerID="209527aed1edda63466bd05ad7d617d8e7c6b3291c87fca7f9c692014030509b" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286023 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286131 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4fhdq\" (UniqueName: \"kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286221 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286267 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.286455 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 
crc kubenswrapper[4848]: I1206 15:50:02.286498 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.287212 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.295541 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.306846 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.306905 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.309588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhdq\" (UniqueName: \"kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq\") pod \"ceilometer-0\" (UID: 
\"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.326012 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " pod="openstack/ceilometer-0" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.400822 4848 scope.go:117] "RemoveContainer" containerID="d88ad6dd118a52d37f98ccf40f244e7afa3d07000a4d03ae84afeda58aa899e5" Dec 06 15:50:02 crc kubenswrapper[4848]: I1206 15:50:02.430377 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:03 crc kubenswrapper[4848]: I1206 15:50:03.034764 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615e4244-19fe-4703-85b3-86086a3e630d" path="/var/lib/kubelet/pods/615e4244-19fe-4703-85b3-86086a3e630d/volumes" Dec 06 15:50:03 crc kubenswrapper[4848]: I1206 15:50:03.036145 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7ef2f0-4fc0-4e48-a862-7818d1989187" path="/var/lib/kubelet/pods/cd7ef2f0-4fc0-4e48-a862-7818d1989187/volumes" Dec 06 15:50:03 crc kubenswrapper[4848]: I1206 15:50:03.042182 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:03 crc kubenswrapper[4848]: W1206 15:50:03.501853 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54d510cb_8af9_4ac2_b9e4_64e381e879e9.slice/crio-73a65d3753232b08514d82fd4c1d389a71355508378bdf03e21cbfc1aba8759d WatchSource:0}: Error finding container 73a65d3753232b08514d82fd4c1d389a71355508378bdf03e21cbfc1aba8759d: Status 404 returned error can't find the container with id 73a65d3753232b08514d82fd4c1d389a71355508378bdf03e21cbfc1aba8759d Dec 06 15:50:04 crc 
kubenswrapper[4848]: I1206 15:50:04.052917 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerStarted","Data":"73a65d3753232b08514d82fd4c1d389a71355508378bdf03e21cbfc1aba8759d"} Dec 06 15:50:04 crc kubenswrapper[4848]: I1206 15:50:04.056785 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6xvjg" event={"ID":"fb815e0a-f0ff-40d4-b1c4-1220b71db056","Type":"ContainerStarted","Data":"90faa7dd031e3d4c899b1bf2dc3d28b8d7dfbeee06464f31acdb8c32962ddf0e"} Dec 06 15:50:04 crc kubenswrapper[4848]: I1206 15:50:04.094836 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-6xvjg" podStartSLOduration=12.175178148 podStartE2EDuration="20.094817649s" podCreationTimestamp="2025-12-06 15:49:44 +0000 UTC" firstStartedPulling="2025-12-06 15:49:55.65227604 +0000 UTC m=+1262.950286963" lastFinishedPulling="2025-12-06 15:50:03.571915551 +0000 UTC m=+1270.869926464" observedRunningTime="2025-12-06 15:50:04.080945723 +0000 UTC m=+1271.378956656" watchObservedRunningTime="2025-12-06 15:50:04.094817649 +0000 UTC m=+1271.392828562" Dec 06 15:50:04 crc kubenswrapper[4848]: I1206 15:50:04.110761 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.066968 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerStarted","Data":"c7cb439b378981963764b4ec1ba479e5f2b7673eb06c39bf77598e26c4901e31"} Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.067319 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerStarted","Data":"58d45486f37e9bf5b900600013358946fd88dd5905001967906a2154b5fed9b3"} Dec 06 15:50:05 crc 
kubenswrapper[4848]: I1206 15:50:05.069434 4848 generic.go:334] "Generic (PLEG): container finished" podID="692f44d3-ff17-419f-b16c-b37f71521603" containerID="1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94" exitCode=1 Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.069464 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerDied","Data":"1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94"} Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.069496 4848 scope.go:117] "RemoveContainer" containerID="d5e79f67375339f06f4b4aa90450db97c503c894c63849c7f0779c4e99d5327e" Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.070272 4848 scope.go:117] "RemoveContainer" containerID="1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94" Dec 06 15:50:05 crc kubenswrapper[4848]: E1206 15:50:05.070497 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-5f6db98496-rh44f_openstack(692f44d3-ff17-419f-b16c-b37f71521603)\"" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" podUID="692f44d3-ff17-419f-b16c-b37f71521603" Dec 06 15:50:05 crc kubenswrapper[4848]: I1206 15:50:05.237132 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.080354 4848 scope.go:117] "RemoveContainer" containerID="1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94" Dec 06 15:50:06 crc kubenswrapper[4848]: E1206 15:50:06.081256 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=ironic-neutron-agent pod=ironic-neutron-agent-5f6db98496-rh44f_openstack(692f44d3-ff17-419f-b16c-b37f71521603)\"" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" podUID="692f44d3-ff17-419f-b16c-b37f71521603" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.082309 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb815e0a-f0ff-40d4-b1c4-1220b71db056" containerID="90faa7dd031e3d4c899b1bf2dc3d28b8d7dfbeee06464f31acdb8c32962ddf0e" exitCode=0 Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.082360 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6xvjg" event={"ID":"fb815e0a-f0ff-40d4-b1c4-1220b71db056","Type":"ContainerDied","Data":"90faa7dd031e3d4c899b1bf2dc3d28b8d7dfbeee06464f31acdb8c32962ddf0e"} Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.100188 4848 generic.go:334] "Generic (PLEG): container finished" podID="b51797dd-e2f2-497c-a13c-921ab2868646" containerID="e17acfc291a003894d0b7e0c70eaebbd7a9553f630e0ad2bb75196abadc3b11a" exitCode=0 Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.100334 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerDied","Data":"e17acfc291a003894d0b7e0c70eaebbd7a9553f630e0ad2bb75196abadc3b11a"} Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.116114 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerStarted","Data":"83fae4717fa4f1e5e01bc4c3a27823bdaeb5f65fb72d882003b86a3f6a631f07"} Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.470544 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.490531 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle\") pod \"b51797dd-e2f2-497c-a13c-921ab2868646\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.490581 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlqsp\" (UniqueName: \"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp\") pod \"b51797dd-e2f2-497c-a13c-921ab2868646\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.490623 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config\") pod \"b51797dd-e2f2-497c-a13c-921ab2868646\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.490663 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs\") pod \"b51797dd-e2f2-497c-a13c-921ab2868646\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.490708 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config\") pod \"b51797dd-e2f2-497c-a13c-921ab2868646\" (UID: \"b51797dd-e2f2-497c-a13c-921ab2868646\") " Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.507907 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp" (OuterVolumeSpecName: "kube-api-access-hlqsp") pod "b51797dd-e2f2-497c-a13c-921ab2868646" (UID: "b51797dd-e2f2-497c-a13c-921ab2868646"). InnerVolumeSpecName "kube-api-access-hlqsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.509665 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b51797dd-e2f2-497c-a13c-921ab2868646" (UID: "b51797dd-e2f2-497c-a13c-921ab2868646"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.576296 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config" (OuterVolumeSpecName: "config") pod "b51797dd-e2f2-497c-a13c-921ab2868646" (UID: "b51797dd-e2f2-497c-a13c-921ab2868646"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.592166 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlqsp\" (UniqueName: \"kubernetes.io/projected/b51797dd-e2f2-497c-a13c-921ab2868646-kube-api-access-hlqsp\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.592199 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.592212 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.609029 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b51797dd-e2f2-497c-a13c-921ab2868646" (UID: "b51797dd-e2f2-497c-a13c-921ab2868646"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.609484 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b51797dd-e2f2-497c-a13c-921ab2868646" (UID: "b51797dd-e2f2-497c-a13c-921ab2868646"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.693778 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:06 crc kubenswrapper[4848]: I1206 15:50:06.693820 4848 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b51797dd-e2f2-497c-a13c-921ab2868646-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.126837 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerStarted","Data":"13ac76d4abd02f84b96b1faf3c8cf177d507095f8a3db19630d36e071048761a"} Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.127156 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.127054 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="proxy-httpd" containerID="cri-o://13ac76d4abd02f84b96b1faf3c8cf177d507095f8a3db19630d36e071048761a" gracePeriod=30 Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.127322 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="sg-core" containerID="cri-o://83fae4717fa4f1e5e01bc4c3a27823bdaeb5f65fb72d882003b86a3f6a631f07" gracePeriod=30 Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.127337 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-notification-agent" 
containerID="cri-o://c7cb439b378981963764b4ec1ba479e5f2b7673eb06c39bf77598e26c4901e31" gracePeriod=30 Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.127429 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-central-agent" containerID="cri-o://58d45486f37e9bf5b900600013358946fd88dd5905001967906a2154b5fed9b3" gracePeriod=30 Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.131358 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85796f7496-8zdjc" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.132244 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85796f7496-8zdjc" event={"ID":"b51797dd-e2f2-497c-a13c-921ab2868646","Type":"ContainerDied","Data":"912768f9da63a04cedac7c4c0911cfb4c8adf844a9390b9fadb0ab11deeebe13"} Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.132308 4848 scope.go:117] "RemoveContainer" containerID="3b6171b08d63c284cb3b979d9307061c12a1098ffb17f2d08e2487a26fe23f44" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.170502 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3553426379999998 podStartE2EDuration="5.170479693s" podCreationTimestamp="2025-12-06 15:50:02 +0000 UTC" firstStartedPulling="2025-12-06 15:50:03.503636094 +0000 UTC m=+1270.801647017" lastFinishedPulling="2025-12-06 15:50:06.318773159 +0000 UTC m=+1273.616784072" observedRunningTime="2025-12-06 15:50:07.154055159 +0000 UTC m=+1274.452066072" watchObservedRunningTime="2025-12-06 15:50:07.170479693 +0000 UTC m=+1274.468490606" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.183781 4848 scope.go:117] "RemoveContainer" containerID="e17acfc291a003894d0b7e0c70eaebbd7a9553f630e0ad2bb75196abadc3b11a" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.184314 4848 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.193661 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-85796f7496-8zdjc"] Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.449595 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.513674 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.513789 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.513837 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.513857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.513883 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jjtnn\" (UniqueName: \"kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.514052 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.514080 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.514100 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle\") pod \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\" (UID: \"fb815e0a-f0ff-40d4-b1c4-1220b71db056\") " Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.514597 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.514882 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod 
"fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.519598 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts" (OuterVolumeSpecName: "scripts") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.520010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.521068 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn" (OuterVolumeSpecName: "kube-api-access-jjtnn") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "kube-api-access-jjtnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.551776 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config" (OuterVolumeSpecName: "config") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.553023 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb815e0a-f0ff-40d4-b1c4-1220b71db056" (UID: "fb815e0a-f0ff-40d4-b1c4-1220b71db056"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616594 4848 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fb815e0a-f0ff-40d4-b1c4-1220b71db056-etc-podinfo\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616627 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616637 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616646 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fb815e0a-f0ff-40d4-b1c4-1220b71db056-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616658 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb815e0a-f0ff-40d4-b1c4-1220b71db056-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:07 crc kubenswrapper[4848]: I1206 15:50:07.616668 4848 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jjtnn\" (UniqueName: \"kubernetes.io/projected/fb815e0a-f0ff-40d4-b1c4-1220b71db056-kube-api-access-jjtnn\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143338 4848 generic.go:334] "Generic (PLEG): container finished" podID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerID="13ac76d4abd02f84b96b1faf3c8cf177d507095f8a3db19630d36e071048761a" exitCode=0 Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143643 4848 generic.go:334] "Generic (PLEG): container finished" podID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerID="83fae4717fa4f1e5e01bc4c3a27823bdaeb5f65fb72d882003b86a3f6a631f07" exitCode=2 Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143653 4848 generic.go:334] "Generic (PLEG): container finished" podID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerID="c7cb439b378981963764b4ec1ba479e5f2b7673eb06c39bf77598e26c4901e31" exitCode=0 Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143416 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerDied","Data":"13ac76d4abd02f84b96b1faf3c8cf177d507095f8a3db19630d36e071048761a"} Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143740 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerDied","Data":"83fae4717fa4f1e5e01bc4c3a27823bdaeb5f65fb72d882003b86a3f6a631f07"} Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.143756 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerDied","Data":"c7cb439b378981963764b4ec1ba479e5f2b7673eb06c39bf77598e26c4901e31"} Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.146182 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6xvjg" 
event={"ID":"fb815e0a-f0ff-40d4-b1c4-1220b71db056","Type":"ContainerDied","Data":"f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d"} Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.146218 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f41e822b59b448b61365d30b524a162ca93659c5bedce0648e329be8ab17901d" Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.146313 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6xvjg" Dec 06 15:50:08 crc kubenswrapper[4848]: I1206 15:50:08.981350 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" path="/var/lib/kubelet/pods/b51797dd-e2f2-497c-a13c-921ab2868646/volumes" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.303467 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:09 crc kubenswrapper[4848]: E1206 15:50:09.304156 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-httpd" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.304170 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-httpd" Dec 06 15:50:09 crc kubenswrapper[4848]: E1206 15:50:09.304193 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-api" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.304199 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-api" Dec 06 15:50:09 crc kubenswrapper[4848]: E1206 15:50:09.304212 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb815e0a-f0ff-40d4-b1c4-1220b71db056" containerName="ironic-inspector-db-sync" Dec 06 15:50:09 crc kubenswrapper[4848]: 
I1206 15:50:09.304217 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb815e0a-f0ff-40d4-b1c4-1220b71db056" containerName="ironic-inspector-db-sync" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.304406 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-api" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.304416 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51797dd-e2f2-497c-a13c-921ab2868646" containerName="neutron-httpd" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.304431 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb815e0a-f0ff-40d4-b1c4-1220b71db056" containerName="ironic-inspector-db-sync" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.306875 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.311224 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.311368 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.324146 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344668 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344799 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344864 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fm6f\" (UniqueName: \"kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344918 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344941 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344958 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.344982 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446263 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446322 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446358 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446419 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts\") pod \"ironic-inspector-0\" (UID: 
\"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446550 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fm6f\" (UniqueName: \"kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.446615 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.447101 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.447115 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.454871 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: 
I1206 15:50:09.454902 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.458449 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.459741 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.472329 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fm6f\" (UniqueName: \"kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f\") pod \"ironic-inspector-0\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:09 crc kubenswrapper[4848]: I1206 15:50:09.623772 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:10 crc kubenswrapper[4848]: I1206 15:50:10.129251 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:10 crc kubenswrapper[4848]: I1206 15:50:10.165059 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f8eb4579-213d-4f3b-97c2-363344f5aabb","Type":"ContainerStarted","Data":"4592be7f7da82551debd92e5bfc4956d026e800febbe2911654df40a479ab190"} Dec 06 15:50:11 crc kubenswrapper[4848]: I1206 15:50:11.177163 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-76qm4" event={"ID":"95a42d59-df8c-420d-bb24-c8476a868dd9","Type":"ContainerStarted","Data":"0b142f17068d8fdee513880326d48298aecb4fa3b2b3502488efe1e6f6753bf1"} Dec 06 15:50:11 crc kubenswrapper[4848]: I1206 15:50:11.179890 4848 generic.go:334] "Generic (PLEG): container finished" podID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerID="2049da01f427191d7cca706d34813e6abf8a70af0679895cf0ccb9538987a0a0" exitCode=0 Dec 06 15:50:11 crc kubenswrapper[4848]: I1206 15:50:11.179943 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f8eb4579-213d-4f3b-97c2-363344f5aabb","Type":"ContainerDied","Data":"2049da01f427191d7cca706d34813e6abf8a70af0679895cf0ccb9538987a0a0"} Dec 06 15:50:11 crc kubenswrapper[4848]: I1206 15:50:11.203652 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-76qm4" podStartSLOduration=4.056518268 podStartE2EDuration="38.203624863s" podCreationTimestamp="2025-12-06 15:49:33 +0000 UTC" firstStartedPulling="2025-12-06 15:49:36.259911115 +0000 UTC m=+1243.557922028" lastFinishedPulling="2025-12-06 15:50:10.40701771 +0000 UTC m=+1277.705028623" observedRunningTime="2025-12-06 15:50:11.198118874 +0000 UTC m=+1278.496129787" watchObservedRunningTime="2025-12-06 
15:50:11.203624863 +0000 UTC m=+1278.501635786" Dec 06 15:50:11 crc kubenswrapper[4848]: I1206 15:50:11.800554 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:13 crc kubenswrapper[4848]: I1206 15:50:13.208625 4848 generic.go:334] "Generic (PLEG): container finished" podID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerID="58d45486f37e9bf5b900600013358946fd88dd5905001967906a2154b5fed9b3" exitCode=0 Dec 06 15:50:13 crc kubenswrapper[4848]: I1206 15:50:13.208706 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerDied","Data":"58d45486f37e9bf5b900600013358946fd88dd5905001967906a2154b5fed9b3"} Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.000864 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055348 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055409 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc 
kubenswrapper[4848]: I1206 15:50:14.055609 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055665 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.055690 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fhdq\" (UniqueName: \"kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq\") pod \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\" (UID: \"54d510cb-8af9-4ac2-b9e4-64e381e879e9\") " Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.058867 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.062711 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.064154 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts" (OuterVolumeSpecName: "scripts") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.080463 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq" (OuterVolumeSpecName: "kube-api-access-4fhdq") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "kube-api-access-4fhdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.113454 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.154884 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157870 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157903 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54d510cb-8af9-4ac2-b9e4-64e381e879e9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157917 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157931 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fhdq\" (UniqueName: \"kubernetes.io/projected/54d510cb-8af9-4ac2-b9e4-64e381e879e9-kube-api-access-4fhdq\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157946 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.157958 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.188574 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data" (OuterVolumeSpecName: "config-data") pod "54d510cb-8af9-4ac2-b9e4-64e381e879e9" (UID: "54d510cb-8af9-4ac2-b9e4-64e381e879e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.220783 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54d510cb-8af9-4ac2-b9e4-64e381e879e9","Type":"ContainerDied","Data":"73a65d3753232b08514d82fd4c1d389a71355508378bdf03e21cbfc1aba8759d"} Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.220833 4848 scope.go:117] "RemoveContainer" containerID="13ac76d4abd02f84b96b1faf3c8cf177d507095f8a3db19630d36e071048761a" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.220955 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.259955 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d510cb-8af9-4ac2-b9e4-64e381e879e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.294474 4848 scope.go:117] "RemoveContainer" containerID="83fae4717fa4f1e5e01bc4c3a27823bdaeb5f65fb72d882003b86a3f6a631f07" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.296260 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.307726 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.327942 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:14 crc kubenswrapper[4848]: E1206 15:50:14.328456 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-central-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328479 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-central-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: E1206 15:50:14.328496 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-notification-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328503 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-notification-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: E1206 15:50:14.328531 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="proxy-httpd" Dec 06 15:50:14 crc 
kubenswrapper[4848]: I1206 15:50:14.328539 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="proxy-httpd" Dec 06 15:50:14 crc kubenswrapper[4848]: E1206 15:50:14.328559 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="sg-core" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328565 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="sg-core" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328807 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="proxy-httpd" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328828 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-notification-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328842 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="ceilometer-central-agent" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.328856 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" containerName="sg-core" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.330502 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.332785 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.332924 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.346121 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.348873 4848 scope.go:117] "RemoveContainer" containerID="c7cb439b378981963764b4ec1ba479e5f2b7673eb06c39bf77598e26c4901e31" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.361924 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7srb\" (UniqueName: \"kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.361969 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.362049 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.362115 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.362162 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.362182 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.362206 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.373527 4848 scope.go:117] "RemoveContainer" containerID="58d45486f37e9bf5b900600013358946fd88dd5905001967906a2154b5fed9b3" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463396 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463441 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463460 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463540 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7srb\" (UniqueName: \"kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463564 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463588 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.463868 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc 
kubenswrapper[4848]: I1206 15:50:14.463898 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.464178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.467836 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.467996 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.478170 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7srb\" (UniqueName: \"kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.478654 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.485429 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data\") pod \"ceilometer-0\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.650460 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:14 crc kubenswrapper[4848]: I1206 15:50:14.980754 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d510cb-8af9-4ac2-b9e4-64e381e879e9" path="/var/lib/kubelet/pods/54d510cb-8af9-4ac2-b9e4-64e381e879e9/volumes" Dec 06 15:50:16 crc kubenswrapper[4848]: I1206 15:50:16.968831 4848 scope.go:117] "RemoveContainer" containerID="1650291e67a51ef39e923800f357d666cfab98b55be82b1529fe1aee26f86e94" Dec 06 15:50:17 crc kubenswrapper[4848]: W1206 15:50:17.120192 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod107f62d3_cd13_45b0_8010_9e90340e069c.slice/crio-7fd48576ae0203fc5ee0bd06d5df90b95d3f8c894a57eb79b684faf847f06f69 WatchSource:0}: Error finding container 7fd48576ae0203fc5ee0bd06d5df90b95d3f8c894a57eb79b684faf847f06f69: Status 404 returned error can't find the container with id 7fd48576ae0203fc5ee0bd06d5df90b95d3f8c894a57eb79b684faf847f06f69 Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.121632 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.150133 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.150207 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.253561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" event={"ID":"692f44d3-ff17-419f-b16c-b37f71521603","Type":"ContainerStarted","Data":"e20701f9e5b7d592c3491af1e6980bf5e4039994447b39b48062a00003fc29ab"} Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.254067 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.256263 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"e62862f64f3ddd8a8b58a5cd993b407eed99356379bdd77f416444c5d03ff513"} Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.257360 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerStarted","Data":"7fd48576ae0203fc5ee0bd06d5df90b95d3f8c894a57eb79b684faf847f06f69"} Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.260143 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f8eb4579-213d-4f3b-97c2-363344f5aabb","Type":"ContainerStarted","Data":"c4e9f98589f7324749d064e071dcb3f6ee1366f04934b6deaf3276a85bc26679"} Dec 06 15:50:17 crc kubenswrapper[4848]: I1206 15:50:17.260344 4848 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ironic-inspector-0" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="inspector-pxe-init" containerID="cri-o://c4e9f98589f7324749d064e071dcb3f6ee1366f04934b6deaf3276a85bc26679" gracePeriod=60 Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.272477 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-5f6db98496-rh44f" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.295314 4848 generic.go:334] "Generic (PLEG): container finished" podID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerID="c4e9f98589f7324749d064e071dcb3f6ee1366f04934b6deaf3276a85bc26679" exitCode=0 Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.295372 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f8eb4579-213d-4f3b-97c2-363344f5aabb","Type":"ContainerDied","Data":"c4e9f98589f7324749d064e071dcb3f6ee1366f04934b6deaf3276a85bc26679"} Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.543586 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591333 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591451 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591479 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fm6f\" (UniqueName: \"kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591502 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591541 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591617 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.591688 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo\") pod \"f8eb4579-213d-4f3b-97c2-363344f5aabb\" (UID: \"f8eb4579-213d-4f3b-97c2-363344f5aabb\") " Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.592309 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.605356 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f" (OuterVolumeSpecName: "kube-api-access-5fm6f") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "kube-api-access-5fm6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.608594 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.609074 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts" (OuterVolumeSpecName: "scripts") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.609159 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config" (OuterVolumeSpecName: "config") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.613992 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.651727 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8eb4579-213d-4f3b-97c2-363344f5aabb" (UID: "f8eb4579-213d-4f3b-97c2-363344f5aabb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693530 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693563 4848 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8eb4579-213d-4f3b-97c2-363344f5aabb-etc-podinfo\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693576 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693588 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693599 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fm6f\" (UniqueName: \"kubernetes.io/projected/f8eb4579-213d-4f3b-97c2-363344f5aabb-kube-api-access-5fm6f\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693612 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f8eb4579-213d-4f3b-97c2-363344f5aabb-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:20 crc kubenswrapper[4848]: I1206 15:50:20.693624 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8eb4579-213d-4f3b-97c2-363344f5aabb-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:21 crc kubenswrapper[4848]: 
I1206 15:50:21.318411 4848 generic.go:334] "Generic (PLEG): container finished" podID="ab198686-7839-4e39-abdb-ea9b65893a02" containerID="e62862f64f3ddd8a8b58a5cd993b407eed99356379bdd77f416444c5d03ff513" exitCode=0 Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.318448 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerDied","Data":"e62862f64f3ddd8a8b58a5cd993b407eed99356379bdd77f416444c5d03ff513"} Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.326164 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f8eb4579-213d-4f3b-97c2-363344f5aabb","Type":"ContainerDied","Data":"4592be7f7da82551debd92e5bfc4956d026e800febbe2911654df40a479ab190"} Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.326217 4848 scope.go:117] "RemoveContainer" containerID="c4e9f98589f7324749d064e071dcb3f6ee1366f04934b6deaf3276a85bc26679" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.326384 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.355082 4848 scope.go:117] "RemoveContainer" containerID="2049da01f427191d7cca706d34813e6abf8a70af0679895cf0ccb9538987a0a0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.410436 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.421204 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.450970 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:21 crc kubenswrapper[4848]: E1206 15:50:21.451653 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="ironic-python-agent-init" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.451675 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="ironic-python-agent-init" Dec 06 15:50:21 crc kubenswrapper[4848]: E1206 15:50:21.451725 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="inspector-pxe-init" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.451734 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="inspector-pxe-init" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.451981 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" containerName="inspector-pxe-init" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.455589 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.459496 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.459718 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.460063 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.465719 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513732 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-config\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513836 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 
15:50:21.513856 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3e8a616c-5de8-4037-86f3-1d4e891947f6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513878 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbq6x\" (UniqueName: \"kubernetes.io/projected/3e8a616c-5de8-4037-86f3-1d4e891947f6-kube-api-access-kbq6x\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513898 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513920 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513937 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.513970 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-scripts\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.574310 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616502 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-config\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616604 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616673 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616724 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3e8a616c-5de8-4037-86f3-1d4e891947f6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc 
kubenswrapper[4848]: I1206 15:50:21.616752 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616770 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq6x\" (UniqueName: \"kubernetes.io/projected/3e8a616c-5de8-4037-86f3-1d4e891947f6-kube-api-access-kbq6x\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616820 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616837 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.616897 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-scripts\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.617652 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.617809 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3e8a616c-5de8-4037-86f3-1d4e891947f6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.621205 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-scripts\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.622211 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.624314 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3e8a616c-5de8-4037-86f3-1d4e891947f6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.633251 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-config\") pod \"ironic-inspector-0\" (UID: 
\"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.634434 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq6x\" (UniqueName: \"kubernetes.io/projected/3e8a616c-5de8-4037-86f3-1d4e891947f6-kube-api-access-kbq6x\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.635743 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.639241 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8a616c-5de8-4037-86f3-1d4e891947f6-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3e8a616c-5de8-4037-86f3-1d4e891947f6\") " pod="openstack/ironic-inspector-0" Dec 06 15:50:21 crc kubenswrapper[4848]: I1206 15:50:21.886560 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 06 15:50:22 crc kubenswrapper[4848]: W1206 15:50:22.350179 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8a616c_5de8_4037_86f3_1d4e891947f6.slice/crio-1c30f5f7c5d1caad17a7e3f84fb9371ba0167610ef11b2ae70461678a2ac62a9 WatchSource:0}: Error finding container 1c30f5f7c5d1caad17a7e3f84fb9371ba0167610ef11b2ae70461678a2ac62a9: Status 404 returned error can't find the container with id 1c30f5f7c5d1caad17a7e3f84fb9371ba0167610ef11b2ae70461678a2ac62a9 Dec 06 15:50:22 crc kubenswrapper[4848]: I1206 15:50:22.352630 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 06 15:50:22 crc kubenswrapper[4848]: I1206 15:50:22.992903 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8eb4579-213d-4f3b-97c2-363344f5aabb" path="/var/lib/kubelet/pods/f8eb4579-213d-4f3b-97c2-363344f5aabb/volumes" Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.019155 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.356526 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"b0da0f6e4da0be9213843fe02b4ec52ec72e4fffb1cb7b2e734d96ea462bd20b"} Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.358920 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerStarted","Data":"6f13c743a2180f40e1a49d5f44ea7ce80ad7ee7e36ff610088df0e9b95eaf9f8"} Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.360852 4848 generic.go:334] "Generic (PLEG): container finished" podID="3e8a616c-5de8-4037-86f3-1d4e891947f6" 
containerID="edc07e74f6b40d8106c5b447d8242524bfba71daa87bc591d74eb7ac51953103" exitCode=0 Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.360940 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerDied","Data":"edc07e74f6b40d8106c5b447d8242524bfba71daa87bc591d74eb7ac51953103"} Dec 06 15:50:23 crc kubenswrapper[4848]: I1206 15:50:23.361169 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"1c30f5f7c5d1caad17a7e3f84fb9371ba0167610ef11b2ae70461678a2ac62a9"} Dec 06 15:50:24 crc kubenswrapper[4848]: I1206 15:50:24.373287 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerStarted","Data":"f0bd17fc5fff6fbc36d309306f7a3322da51752d450b781d6f333bf485ad0622"} Dec 06 15:50:25 crc kubenswrapper[4848]: I1206 15:50:25.383553 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"50b0ca92ab3212813d4b53d5459f01e0636c7a84fc4bd2c42d71145871fc43ab"} Dec 06 15:50:29 crc kubenswrapper[4848]: I1206 15:50:29.420732 4848 generic.go:334] "Generic (PLEG): container finished" podID="3e8a616c-5de8-4037-86f3-1d4e891947f6" containerID="50b0ca92ab3212813d4b53d5459f01e0636c7a84fc4bd2c42d71145871fc43ab" exitCode=0 Dec 06 15:50:29 crc kubenswrapper[4848]: I1206 15:50:29.420811 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerDied","Data":"50b0ca92ab3212813d4b53d5459f01e0636c7a84fc4bd2c42d71145871fc43ab"} Dec 06 15:50:29 crc kubenswrapper[4848]: I1206 15:50:29.425025 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerStarted","Data":"9d1d57a37761ac67516f22903e41f834ac780f2c55d08fa802cd7b1ffc62cb8a"} Dec 06 15:50:31 crc kubenswrapper[4848]: I1206 15:50:31.442543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"6c78941c3be43fa62e70d812a30595a7b57fdf2f8597a6041148ea36a6885d9f"} Dec 06 15:50:32 crc kubenswrapper[4848]: I1206 15:50:32.452464 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerStarted","Data":"2ef20e63dcab8f6740f9d094f3365e25e2f11efd04e6d3e2535b91156ca7d0b6"} Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"6846c838ffc0ba0ffe4b4865afc16320413eaec63fc57e02d4d30ed25a2ad4ea"} Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475766 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"152500a1903cdcac57396847380637784338a97bcd0f9955e718982a1c40962a"} Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475793 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475230 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-notification-agent" containerID="cri-o://f0bd17fc5fff6fbc36d309306f7a3322da51752d450b781d6f333bf485ad0622" gracePeriod=30 Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475172 4848 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-central-agent" containerID="cri-o://6f13c743a2180f40e1a49d5f44ea7ce80ad7ee7e36ff610088df0e9b95eaf9f8" gracePeriod=30 Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475184 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="proxy-httpd" containerID="cri-o://2ef20e63dcab8f6740f9d094f3365e25e2f11efd04e6d3e2535b91156ca7d0b6" gracePeriod=30 Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.475252 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="sg-core" containerID="cri-o://9d1d57a37761ac67516f22903e41f834ac780f2c55d08fa802cd7b1ffc62cb8a" gracePeriod=30 Dec 06 15:50:33 crc kubenswrapper[4848]: I1206 15:50:33.506992 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.617363627 podStartE2EDuration="19.506972723s" podCreationTimestamp="2025-12-06 15:50:14 +0000 UTC" firstStartedPulling="2025-12-06 15:50:17.123073728 +0000 UTC m=+1284.421084631" lastFinishedPulling="2025-12-06 15:50:32.012682814 +0000 UTC m=+1299.310693727" observedRunningTime="2025-12-06 15:50:33.499577474 +0000 UTC m=+1300.797588387" watchObservedRunningTime="2025-12-06 15:50:33.506972723 +0000 UTC m=+1300.804983636" Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.488419 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3e8a616c-5de8-4037-86f3-1d4e891947f6","Type":"ContainerStarted","Data":"6bcb73e98476e2a070a6bb135302421c50869c7e00f8d6fa0ab7f2d31054caca"} Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.490115 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492607 4848 generic.go:334] "Generic (PLEG): container finished" podID="107f62d3-cd13-45b0-8010-9e90340e069c" containerID="2ef20e63dcab8f6740f9d094f3365e25e2f11efd04e6d3e2535b91156ca7d0b6" exitCode=0 Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492633 4848 generic.go:334] "Generic (PLEG): container finished" podID="107f62d3-cd13-45b0-8010-9e90340e069c" containerID="9d1d57a37761ac67516f22903e41f834ac780f2c55d08fa802cd7b1ffc62cb8a" exitCode=2 Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492640 4848 generic.go:334] "Generic (PLEG): container finished" podID="107f62d3-cd13-45b0-8010-9e90340e069c" containerID="f0bd17fc5fff6fbc36d309306f7a3322da51752d450b781d6f333bf485ad0622" exitCode=0 Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492655 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerDied","Data":"2ef20e63dcab8f6740f9d094f3365e25e2f11efd04e6d3e2535b91156ca7d0b6"} Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492672 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerDied","Data":"9d1d57a37761ac67516f22903e41f834ac780f2c55d08fa802cd7b1ffc62cb8a"} Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.492681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerDied","Data":"f0bd17fc5fff6fbc36d309306f7a3322da51752d450b781d6f333bf485ad0622"} Dec 06 15:50:34 crc kubenswrapper[4848]: I1206 15:50:34.529869 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=13.529850536 podStartE2EDuration="13.529850536s" podCreationTimestamp="2025-12-06 
15:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:50:34.518647103 +0000 UTC m=+1301.816658016" watchObservedRunningTime="2025-12-06 15:50:34.529850536 +0000 UTC m=+1301.827861449" Dec 06 15:50:36 crc kubenswrapper[4848]: I1206 15:50:36.511783 4848 generic.go:334] "Generic (PLEG): container finished" podID="107f62d3-cd13-45b0-8010-9e90340e069c" containerID="6f13c743a2180f40e1a49d5f44ea7ce80ad7ee7e36ff610088df0e9b95eaf9f8" exitCode=0 Dec 06 15:50:36 crc kubenswrapper[4848]: I1206 15:50:36.511825 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerDied","Data":"6f13c743a2180f40e1a49d5f44ea7ce80ad7ee7e36ff610088df0e9b95eaf9f8"} Dec 06 15:50:36 crc kubenswrapper[4848]: I1206 15:50:36.887338 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 06 15:50:36 crc kubenswrapper[4848]: I1206 15:50:36.887757 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.205094 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.323690 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.323808 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.323993 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.324039 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7srb\" (UniqueName: \"kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.324074 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.324210 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.324410 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts\") pod \"107f62d3-cd13-45b0-8010-9e90340e069c\" (UID: \"107f62d3-cd13-45b0-8010-9e90340e069c\") " Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.327067 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.327418 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.336229 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts" (OuterVolumeSpecName: "scripts") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.340631 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb" (OuterVolumeSpecName: "kube-api-access-f7srb") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "kube-api-access-f7srb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.364773 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.427494 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.427545 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.427557 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107f62d3-cd13-45b0-8010-9e90340e069c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.427570 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7srb\" (UniqueName: \"kubernetes.io/projected/107f62d3-cd13-45b0-8010-9e90340e069c-kube-api-access-f7srb\") on node \"crc\" 
DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.427588 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.442324 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data" (OuterVolumeSpecName: "config-data") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.444048 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107f62d3-cd13-45b0-8010-9e90340e069c" (UID: "107f62d3-cd13-45b0-8010-9e90340e069c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.523146 4848 generic.go:334] "Generic (PLEG): container finished" podID="95a42d59-df8c-420d-bb24-c8476a868dd9" containerID="0b142f17068d8fdee513880326d48298aecb4fa3b2b3502488efe1e6f6753bf1" exitCode=0 Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.523224 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-76qm4" event={"ID":"95a42d59-df8c-420d-bb24-c8476a868dd9","Type":"ContainerDied","Data":"0b142f17068d8fdee513880326d48298aecb4fa3b2b3502488efe1e6f6753bf1"} Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.526481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"107f62d3-cd13-45b0-8010-9e90340e069c","Type":"ContainerDied","Data":"7fd48576ae0203fc5ee0bd06d5df90b95d3f8c894a57eb79b684faf847f06f69"} Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.526555 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.526565 4848 scope.go:117] "RemoveContainer" containerID="2ef20e63dcab8f6740f9d094f3365e25e2f11efd04e6d3e2535b91156ca7d0b6" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.528790 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.528819 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107f62d3-cd13-45b0-8010-9e90340e069c-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.545578 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.574142 4848 scope.go:117] "RemoveContainer" containerID="9d1d57a37761ac67516f22903e41f834ac780f2c55d08fa802cd7b1ffc62cb8a" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.602338 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.608934 4848 scope.go:117] "RemoveContainer" containerID="f0bd17fc5fff6fbc36d309306f7a3322da51752d450b781d6f333bf485ad0622" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.609886 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.627360 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:37 crc kubenswrapper[4848]: E1206 15:50:37.628085 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-central-agent" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 
15:50:37.628172 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-central-agent" Dec 06 15:50:37 crc kubenswrapper[4848]: E1206 15:50:37.628254 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="proxy-httpd" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.628322 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="proxy-httpd" Dec 06 15:50:37 crc kubenswrapper[4848]: E1206 15:50:37.628401 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-notification-agent" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.628462 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-notification-agent" Dec 06 15:50:37 crc kubenswrapper[4848]: E1206 15:50:37.628530 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="sg-core" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.628831 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="sg-core" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.629103 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-notification-agent" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.629180 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="sg-core" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.629271 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="ceilometer-central-agent" Dec 06 15:50:37 crc 
kubenswrapper[4848]: I1206 15:50:37.629346 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" containerName="proxy-httpd" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.634257 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.637641 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.638250 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.656796 4848 scope.go:117] "RemoveContainer" containerID="6f13c743a2180f40e1a49d5f44ea7ce80ad7ee7e36ff610088df0e9b95eaf9f8" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.658332 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735193 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4c7\" (UniqueName: \"kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735402 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735635 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735670 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735760 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.735838 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837566 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4c7\" (UniqueName: \"kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " 
pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837773 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837804 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837838 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.837858 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.838719 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.838769 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.842469 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.843297 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.854752 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.865434 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8z4c7\" (UniqueName: \"kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.866381 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts\") pod \"ceilometer-0\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " pod="openstack/ceilometer-0" Dec 06 15:50:37 crc kubenswrapper[4848]: I1206 15:50:37.956799 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:50:38 crc kubenswrapper[4848]: I1206 15:50:38.429118 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:50:38 crc kubenswrapper[4848]: W1206 15:50:38.443480 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb85487f_67a6_4596_86b5_5c6f18797527.slice/crio-c7b3b227c3dd44ec1b7f92194a3cc785115f4a9e56714991fb6d2d4a8eae9af0 WatchSource:0}: Error finding container c7b3b227c3dd44ec1b7f92194a3cc785115f4a9e56714991fb6d2d4a8eae9af0: Status 404 returned error can't find the container with id c7b3b227c3dd44ec1b7f92194a3cc785115f4a9e56714991fb6d2d4a8eae9af0 Dec 06 15:50:38 crc kubenswrapper[4848]: I1206 15:50:38.536768 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerStarted","Data":"c7b3b227c3dd44ec1b7f92194a3cc785115f4a9e56714991fb6d2d4a8eae9af0"} Dec 06 15:50:38 crc kubenswrapper[4848]: I1206 15:50:38.904203 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:50:38 crc kubenswrapper[4848]: I1206 15:50:38.981319 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107f62d3-cd13-45b0-8010-9e90340e069c" path="/var/lib/kubelet/pods/107f62d3-cd13-45b0-8010-9e90340e069c/volumes" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.067852 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data\") pod \"95a42d59-df8c-420d-bb24-c8476a868dd9\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.068042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2ptl\" (UniqueName: \"kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl\") pod \"95a42d59-df8c-420d-bb24-c8476a868dd9\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.068069 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle\") pod \"95a42d59-df8c-420d-bb24-c8476a868dd9\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.068099 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts\") pod \"95a42d59-df8c-420d-bb24-c8476a868dd9\" (UID: \"95a42d59-df8c-420d-bb24-c8476a868dd9\") " Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.072233 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts" (OuterVolumeSpecName: "scripts") pod 
"95a42d59-df8c-420d-bb24-c8476a868dd9" (UID: "95a42d59-df8c-420d-bb24-c8476a868dd9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.072283 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl" (OuterVolumeSpecName: "kube-api-access-k2ptl") pod "95a42d59-df8c-420d-bb24-c8476a868dd9" (UID: "95a42d59-df8c-420d-bb24-c8476a868dd9"). InnerVolumeSpecName "kube-api-access-k2ptl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.094343 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data" (OuterVolumeSpecName: "config-data") pod "95a42d59-df8c-420d-bb24-c8476a868dd9" (UID: "95a42d59-df8c-420d-bb24-c8476a868dd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.098494 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a42d59-df8c-420d-bb24-c8476a868dd9" (UID: "95a42d59-df8c-420d-bb24-c8476a868dd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.170860 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2ptl\" (UniqueName: \"kubernetes.io/projected/95a42d59-df8c-420d-bb24-c8476a868dd9-kube-api-access-k2ptl\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.170896 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.170906 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.170917 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a42d59-df8c-420d-bb24-c8476a868dd9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.546480 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerStarted","Data":"ec02952907aa06b02890acf47dfd39efa4bdd7ad974ba0c683bea2066623db01"} Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.550669 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-76qm4" event={"ID":"95a42d59-df8c-420d-bb24-c8476a868dd9","Type":"ContainerDied","Data":"0deb40f6f166e9a45d629f5b9ca7f976b01796555e2e64b7ceb91f671a89d440"} Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.550752 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0deb40f6f166e9a45d629f5b9ca7f976b01796555e2e64b7ceb91f671a89d440" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 
15:50:39.550842 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-76qm4" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.667792 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 15:50:39 crc kubenswrapper[4848]: E1206 15:50:39.668250 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" containerName="nova-cell0-conductor-db-sync" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.668274 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" containerName="nova-cell0-conductor-db-sync" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.668521 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" containerName="nova-cell0-conductor-db-sync" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.669286 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.672675 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6lzkc" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.672942 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.687940 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.782183 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzf4c\" (UniqueName: \"kubernetes.io/projected/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-kube-api-access-dzf4c\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.782647 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.782680 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.894633 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzf4c\" (UniqueName: 
\"kubernetes.io/projected/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-kube-api-access-dzf4c\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.895157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.895302 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.900989 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.901584 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.917211 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzf4c\" (UniqueName: \"kubernetes.io/projected/ea901ebd-9d73-4ba0-8448-634e2b3d17f7-kube-api-access-dzf4c\") pod \"nova-cell0-conductor-0\" (UID: 
\"ea901ebd-9d73-4ba0-8448-634e2b3d17f7\") " pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:39 crc kubenswrapper[4848]: I1206 15:50:39.986649 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:40 crc kubenswrapper[4848]: W1206 15:50:40.418208 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea901ebd_9d73_4ba0_8448_634e2b3d17f7.slice/crio-682ddcf1a0b76879a13041e4c01c35eaa36daf2db3ff6f33285aed4c90fedc52 WatchSource:0}: Error finding container 682ddcf1a0b76879a13041e4c01c35eaa36daf2db3ff6f33285aed4c90fedc52: Status 404 returned error can't find the container with id 682ddcf1a0b76879a13041e4c01c35eaa36daf2db3ff6f33285aed4c90fedc52 Dec 06 15:50:40 crc kubenswrapper[4848]: I1206 15:50:40.421531 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 15:50:40 crc kubenswrapper[4848]: I1206 15:50:40.564831 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ea901ebd-9d73-4ba0-8448-634e2b3d17f7","Type":"ContainerStarted","Data":"682ddcf1a0b76879a13041e4c01c35eaa36daf2db3ff6f33285aed4c90fedc52"} Dec 06 15:50:40 crc kubenswrapper[4848]: I1206 15:50:40.566809 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerStarted","Data":"023d2db23ce2bb90a41ac0c248dfd419e5a789661b3ba84fb98446a3a3fb7085"} Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.579964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ea901ebd-9d73-4ba0-8448-634e2b3d17f7","Type":"ContainerStarted","Data":"4e9ce6e0e3f8bcc9ac719e4cbb24ba012477433a41747d4cb4bbaea65d7141c6"} Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.580889 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.607231 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.607213169 podStartE2EDuration="2.607213169s" podCreationTimestamp="2025-12-06 15:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:50:41.59798821 +0000 UTC m=+1308.895999123" watchObservedRunningTime="2025-12-06 15:50:41.607213169 +0000 UTC m=+1308.905224082" Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.887680 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.887743 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.918008 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 06 15:50:41 crc kubenswrapper[4848]: I1206 15:50:41.921146 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 06 15:50:42 crc kubenswrapper[4848]: I1206 15:50:42.599140 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 06 15:50:42 crc kubenswrapper[4848]: I1206 15:50:42.601772 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 06 15:50:45 crc kubenswrapper[4848]: I1206 15:50:45.652778 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerStarted","Data":"6f1d0ee1f506c31e0d0460bd13963b1647513852d3905cdc837f805c0d6f432b"} Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 
15:50:47.150612 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.150907 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.150947 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.151582 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.151635 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416" gracePeriod=600 Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.679970 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerStarted","Data":"adfad597cc41baeb5d080038455d5937394910bfeae368e714cd413683773790"} Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.680317 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.683815 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416" exitCode=0 Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.683862 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416"} Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.683885 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8"} Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.683901 4848 scope.go:117] "RemoveContainer" containerID="7145cee45c506bc8604a623c0766622691ca486056cd069a6687b453e59facaa" Dec 06 15:50:47 crc kubenswrapper[4848]: I1206 15:50:47.702560 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.628548773 podStartE2EDuration="10.702542322s" podCreationTimestamp="2025-12-06 15:50:37 +0000 UTC" firstStartedPulling="2025-12-06 15:50:38.446524158 +0000 UTC m=+1305.744535081" lastFinishedPulling="2025-12-06 15:50:46.520517717 +0000 UTC m=+1313.818528630" observedRunningTime="2025-12-06 15:50:47.701236147 +0000 UTC m=+1314.999247060" watchObservedRunningTime="2025-12-06 
15:50:47.702542322 +0000 UTC m=+1315.000553235" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.013913 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.426846 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8k5vr"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.428140 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.430304 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.437374 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8k5vr"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.441151 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.584900 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.585237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9wp\" (UniqueName: \"kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.585298 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.585388 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.633853 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.635544 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.646918 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.653049 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.678909 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.684991 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.686720 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9wp\" (UniqueName: \"kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.686789 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.686840 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.686935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.694349 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.711480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.714195 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.732492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.754796 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.766228 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9wp\" (UniqueName: \"kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp\") pod \"nova-cell0-cell-mapping-8k5vr\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.786069 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.812170 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.813850 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mklv\" (UniqueName: \"kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.813910 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.814015 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.814063 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.814097 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 
15:50:50.814134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.814160 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.814180 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89xj\" (UniqueName: \"kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.820138 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.907375 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.915809 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mklv\" (UniqueName: \"kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.915858 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.915894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.915936 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmt9\" (UniqueName: \"kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916014 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916073 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " 
pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916095 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916119 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916140 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.916249 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89xj\" (UniqueName: \"kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.921914 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.924299 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.926222 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.928611 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.928800 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.930480 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.936442 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.938317 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.949908 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89xj\" (UniqueName: \"kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj\") pod \"nova-api-0\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.955533 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.956915 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.963609 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.973843 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.974132 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mklv\" (UniqueName: \"kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv\") pod \"nova-metadata-0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " pod="openstack/nova-metadata-0" Dec 06 15:50:50 crc kubenswrapper[4848]: I1206 15:50:50.999515 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.011824 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.019521 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.022034 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsxc6\" (UniqueName: \"kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.022165 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") 
pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.022243 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krmt9\" (UniqueName: \"kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.022322 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.022435 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.023041 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4nbf\" (UniqueName: \"kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.023450 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb\") pod 
\"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.023564 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.023676 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.024005 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.024161 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.042228 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") 
" pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.045023 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.051170 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmt9\" (UniqueName: \"kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.054431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129372 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129681 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129717 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsxc6\" (UniqueName: \"kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129779 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.129956 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4nbf\" (UniqueName: \"kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.132364 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.132412 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " 
pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.132448 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.133404 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.133811 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.133861 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.134499 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 
15:50:51.134529 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.140424 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.141496 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.155050 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4nbf\" (UniqueName: \"kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf\") pod \"dnsmasq-dns-865f5d856f-mjqzp\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.155674 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsxc6\" (UniqueName: \"kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6\") pod \"nova-cell1-novncproxy-0\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.194954 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.213037 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.373923 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.396519 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.491467 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.661567 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hbwm"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.665475 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.675444 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.675493 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.684187 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hbwm"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.695840 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8k5vr"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.745012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nq9m\" (UniqueName: \"kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.745086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.745177 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " 
pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.745259 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.804083 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.840528 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerStarted","Data":"3b35df775fbe6d9f2362ab2e0226e24f0606becebb2369979ea5e37de51f776a"} Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.845838 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8k5vr" event={"ID":"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f","Type":"ContainerStarted","Data":"a1410770e66756e7c0910d86f61303202e6559ec56e043e4e56e2913505c3c1f"} Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.846421 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nq9m\" (UniqueName: \"kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.846461 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " 
pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.846501 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.846527 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.853389 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerStarted","Data":"f0530d1e5cb8a943602f920c5a25f31ce8c3f55bea0cb7ce1b849640cf270a59"} Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.853869 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.857733 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.873179 4848 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.896483 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nq9m\" (UniqueName: \"kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:51 crc kubenswrapper[4848]: I1206 15:50:51.899424 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data\") pod \"nova-cell1-conductor-db-sync-8hbwm\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.003116 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.059194 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.070245 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:50:52 crc kubenswrapper[4848]: W1206 15:50:52.081240 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54827637_22f6_42b6_afe6_bbb21ae65924.slice/crio-2f47f9bfba5961e9702e7308d1c27943ea17a137eb6ff7656f7d55db6a72dc03 WatchSource:0}: Error finding container 2f47f9bfba5961e9702e7308d1c27943ea17a137eb6ff7656f7d55db6a72dc03: Status 404 returned error can't find the container with id 2f47f9bfba5961e9702e7308d1c27943ea17a137eb6ff7656f7d55db6a72dc03 Dec 06 15:50:52 crc kubenswrapper[4848]: W1206 15:50:52.082562 4848 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ef0ec0_5ab7_4256_920f_da903c1e4548.slice/crio-560c6010daac26c16221018cd8feeea3067d40b90c8e83aa06588a7b6e5243b6 WatchSource:0}: Error finding container 560c6010daac26c16221018cd8feeea3067d40b90c8e83aa06588a7b6e5243b6: Status 404 returned error can't find the container with id 560c6010daac26c16221018cd8feeea3067d40b90c8e83aa06588a7b6e5243b6 Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.600406 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hbwm"] Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.865936 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8k5vr" event={"ID":"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f","Type":"ContainerStarted","Data":"e37b77da47c99c43de099f8dcc3f0726f7596dbe4eeb8ca1761bc8c082d88f49"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.867370 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" event={"ID":"d719bb86-9c8a-47a5-9b01-010f0ac07dac","Type":"ContainerStarted","Data":"9f3176b572353df94600c9701b12dd24e9bbcba39ab08dd82502008f9f761560"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.869583 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54827637-22f6-42b6-afe6-bbb21ae65924","Type":"ContainerStarted","Data":"2f47f9bfba5961e9702e7308d1c27943ea17a137eb6ff7656f7d55db6a72dc03"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.873529 4848 generic.go:334] "Generic (PLEG): container finished" podID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerID="4b5fe1ac7296c94ca393ab01fad2a2276f2192c9a4e7077ce20e288add87c128" exitCode=0 Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.873585 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" 
event={"ID":"21ef0ec0-5ab7-4256-920f-da903c1e4548","Type":"ContainerDied","Data":"4b5fe1ac7296c94ca393ab01fad2a2276f2192c9a4e7077ce20e288add87c128"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.873838 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" event={"ID":"21ef0ec0-5ab7-4256-920f-da903c1e4548","Type":"ContainerStarted","Data":"560c6010daac26c16221018cd8feeea3067d40b90c8e83aa06588a7b6e5243b6"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.877961 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"13bdfbbb-3bd3-41ad-8cf7-7c2825244824","Type":"ContainerStarted","Data":"b44b400d000ccf3c84ed255041940eb8b8ac0ec70357c8c098b8ba94319b561d"} Dec 06 15:50:52 crc kubenswrapper[4848]: I1206 15:50:52.886049 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8k5vr" podStartSLOduration=2.886030527 podStartE2EDuration="2.886030527s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:50:52.879175022 +0000 UTC m=+1320.177185945" watchObservedRunningTime="2025-12-06 15:50:52.886030527 +0000 UTC m=+1320.184041440" Dec 06 15:50:53 crc kubenswrapper[4848]: I1206 15:50:53.899544 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" event={"ID":"d719bb86-9c8a-47a5-9b01-010f0ac07dac","Type":"ContainerStarted","Data":"f6e1852ba42c8d6702ad8fb8a52d9026c79f96bf5ac698a3ad19be532e4185de"} Dec 06 15:50:53 crc kubenswrapper[4848]: I1206 15:50:53.915056 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" podStartSLOduration=2.9150343850000002 podStartE2EDuration="2.915034385s" podCreationTimestamp="2025-12-06 15:50:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:50:53.912540207 +0000 UTC m=+1321.210551130" watchObservedRunningTime="2025-12-06 15:50:53.915034385 +0000 UTC m=+1321.213045298" Dec 06 15:50:55 crc kubenswrapper[4848]: I1206 15:50:55.408580 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:50:55 crc kubenswrapper[4848]: I1206 15:50:55.425529 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.936955 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" event={"ID":"21ef0ec0-5ab7-4256-920f-da903c1e4548","Type":"ContainerStarted","Data":"6c91bc25ae68e9b53e31425455d64d2116673f043295ae1291a349325e3df008"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.938972 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.940878 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerStarted","Data":"a0360a4059ffa229c437af1d3beae964e528311b5d83e103137c08020182a97e"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.941009 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerStarted","Data":"e9675ba1442276ce233e746b947635987da20566e460835ead1c80f0f3f2b249"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.943031 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"13bdfbbb-3bd3-41ad-8cf7-7c2825244824","Type":"ContainerStarted","Data":"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7"} Dec 06 15:50:56 crc 
kubenswrapper[4848]: I1206 15:50:56.946653 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerStarted","Data":"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.946707 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerStarted","Data":"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.946760 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-log" containerID="cri-o://5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" gracePeriod=30 Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.946828 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-metadata" containerID="cri-o://ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" gracePeriod=30 Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.948947 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54827637-22f6-42b6-afe6-bbb21ae65924","Type":"ContainerStarted","Data":"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a"} Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.949187 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="54827637-22f6-42b6-afe6-bbb21ae65924" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a" gracePeriod=30 Dec 06 15:50:56 crc 
kubenswrapper[4848]: I1206 15:50:56.962280 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" podStartSLOduration=6.962258115 podStartE2EDuration="6.962258115s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:50:56.960270481 +0000 UTC m=+1324.258281394" watchObservedRunningTime="2025-12-06 15:50:56.962258115 +0000 UTC m=+1324.260269028" Dec 06 15:50:56 crc kubenswrapper[4848]: I1206 15:50:56.990834 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6772455649999998 podStartE2EDuration="6.990812005s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="2025-12-06 15:50:51.525335412 +0000 UTC m=+1318.823346325" lastFinishedPulling="2025-12-06 15:50:55.838901852 +0000 UTC m=+1323.136912765" observedRunningTime="2025-12-06 15:50:56.982004217 +0000 UTC m=+1324.280015130" watchObservedRunningTime="2025-12-06 15:50:56.990812005 +0000 UTC m=+1324.288822918" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.001305 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.009086857 podStartE2EDuration="7.001287397s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="2025-12-06 15:50:51.846736502 +0000 UTC m=+1319.144747415" lastFinishedPulling="2025-12-06 15:50:55.838937042 +0000 UTC m=+1323.136947955" observedRunningTime="2025-12-06 15:50:56.994833073 +0000 UTC m=+1324.292843976" watchObservedRunningTime="2025-12-06 15:50:57.001287397 +0000 UTC m=+1324.299298310" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.014909 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.268074453 
podStartE2EDuration="7.014889104s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="2025-12-06 15:50:52.090412726 +0000 UTC m=+1319.388423639" lastFinishedPulling="2025-12-06 15:50:55.837227377 +0000 UTC m=+1323.135238290" observedRunningTime="2025-12-06 15:50:57.009846148 +0000 UTC m=+1324.307857061" watchObservedRunningTime="2025-12-06 15:50:57.014889104 +0000 UTC m=+1324.312900027" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.033043 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.031166802 podStartE2EDuration="7.033022063s" podCreationTimestamp="2025-12-06 15:50:50 +0000 UTC" firstStartedPulling="2025-12-06 15:50:51.837088692 +0000 UTC m=+1319.135099605" lastFinishedPulling="2025-12-06 15:50:55.838943953 +0000 UTC m=+1323.136954866" observedRunningTime="2025-12-06 15:50:57.027745921 +0000 UTC m=+1324.325756844" watchObservedRunningTime="2025-12-06 15:50:57.033022063 +0000 UTC m=+1324.331032976" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.563717 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.675174 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data\") pod \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.675314 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs\") pod \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.675493 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") pod \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.675523 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mklv\" (UniqueName: \"kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv\") pod \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.676634 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs" (OuterVolumeSpecName: "logs") pod "538d9d51-0ea2-48c7-9428-0879a99fa5c0" (UID: "538d9d51-0ea2-48c7-9428-0879a99fa5c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.681987 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv" (OuterVolumeSpecName: "kube-api-access-8mklv") pod "538d9d51-0ea2-48c7-9428-0879a99fa5c0" (UID: "538d9d51-0ea2-48c7-9428-0879a99fa5c0"). InnerVolumeSpecName "kube-api-access-8mklv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:50:57 crc kubenswrapper[4848]: E1206 15:50:57.708079 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle podName:538d9d51-0ea2-48c7-9428-0879a99fa5c0 nodeName:}" failed. No retries permitted until 2025-12-06 15:50:58.208029312 +0000 UTC m=+1325.506040225 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle") pod "538d9d51-0ea2-48c7-9428-0879a99fa5c0" (UID: "538d9d51-0ea2-48c7-9428-0879a99fa5c0") : error deleting /var/lib/kubelet/pods/538d9d51-0ea2-48c7-9428-0879a99fa5c0/volume-subpaths: remove /var/lib/kubelet/pods/538d9d51-0ea2-48c7-9428-0879a99fa5c0/volume-subpaths: no such file or directory Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.710351 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data" (OuterVolumeSpecName: "config-data") pod "538d9d51-0ea2-48c7-9428-0879a99fa5c0" (UID: "538d9d51-0ea2-48c7-9428-0879a99fa5c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.777938 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.778234 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/538d9d51-0ea2-48c7-9428-0879a99fa5c0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.778298 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mklv\" (UniqueName: \"kubernetes.io/projected/538d9d51-0ea2-48c7-9428-0879a99fa5c0-kube-api-access-8mklv\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.968062 4848 generic.go:334] "Generic (PLEG): container finished" podID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerID="ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" exitCode=0 Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.968095 4848 generic.go:334] "Generic (PLEG): container finished" podID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerID="5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" exitCode=143 Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.968197 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerDied","Data":"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223"} Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.968236 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerDied","Data":"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a"} Dec 06 15:50:57 crc kubenswrapper[4848]: 
I1206 15:50:57.968247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"538d9d51-0ea2-48c7-9428-0879a99fa5c0","Type":"ContainerDied","Data":"f0530d1e5cb8a943602f920c5a25f31ce8c3f55bea0cb7ce1b849640cf270a59"} Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.968278 4848 scope.go:117] "RemoveContainer" containerID="ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" Dec 06 15:50:57 crc kubenswrapper[4848]: I1206 15:50:57.969058 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.045373 4848 scope.go:117] "RemoveContainer" containerID="5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.067400 4848 scope.go:117] "RemoveContainer" containerID="ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" Dec 06 15:50:58 crc kubenswrapper[4848]: E1206 15:50:58.067881 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223\": container with ID starting with ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223 not found: ID does not exist" containerID="ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.067914 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223"} err="failed to get container status \"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223\": rpc error: code = NotFound desc = could not find container \"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223\": container with ID starting with ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223 
not found: ID does not exist" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.067934 4848 scope.go:117] "RemoveContainer" containerID="5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" Dec 06 15:50:58 crc kubenswrapper[4848]: E1206 15:50:58.068174 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a\": container with ID starting with 5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a not found: ID does not exist" containerID="5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.068195 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a"} err="failed to get container status \"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a\": rpc error: code = NotFound desc = could not find container \"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a\": container with ID starting with 5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a not found: ID does not exist" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.068208 4848 scope.go:117] "RemoveContainer" containerID="ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.068412 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223"} err="failed to get container status \"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223\": rpc error: code = NotFound desc = could not find container \"ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223\": container with ID starting with 
ede6701a6d29e135a449844b453fba6ba8c64d42dc543a6304ab0f16bfd27223 not found: ID does not exist" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.068430 4848 scope.go:117] "RemoveContainer" containerID="5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.068608 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a"} err="failed to get container status \"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a\": rpc error: code = NotFound desc = could not find container \"5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a\": container with ID starting with 5faedc8f35b9bc6dca796499af1134fe23d2bd3111ab6280745103c9ce48c32a not found: ID does not exist" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.289405 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") pod \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\" (UID: \"538d9d51-0ea2-48c7-9428-0879a99fa5c0\") " Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.297882 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "538d9d51-0ea2-48c7-9428-0879a99fa5c0" (UID: "538d9d51-0ea2-48c7-9428-0879a99fa5c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.392474 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538d9d51-0ea2-48c7-9428-0879a99fa5c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.600642 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.647163 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.655873 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:58 crc kubenswrapper[4848]: E1206 15:50:58.656444 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-log" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.656546 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-log" Dec 06 15:50:58 crc kubenswrapper[4848]: E1206 15:50:58.656623 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-metadata" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.656682 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-metadata" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.657076 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" containerName="nova-metadata-metadata" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.657163 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" 
containerName="nova-metadata-log" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.658271 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.660234 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.660492 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.679752 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.698085 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.698304 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.698357 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.698438 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l5l6n\" (UniqueName: \"kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.698510 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.800220 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.800325 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.800431 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.800489 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data\") pod \"nova-metadata-0\" (UID: 
\"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.800573 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5l6n\" (UniqueName: \"kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.801157 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.805900 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.806599 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.815489 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.816371 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5l6n\" (UniqueName: \"kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n\") pod \"nova-metadata-0\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " pod="openstack/nova-metadata-0" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.977191 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="538d9d51-0ea2-48c7-9428-0879a99fa5c0" path="/var/lib/kubelet/pods/538d9d51-0ea2-48c7-9428-0879a99fa5c0/volumes" Dec 06 15:50:58 crc kubenswrapper[4848]: I1206 15:50:58.979302 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:50:59 crc kubenswrapper[4848]: I1206 15:50:59.689611 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:50:59 crc kubenswrapper[4848]: I1206 15:50:59.992144 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerStarted","Data":"5ee7f46c6fdad0970458ef2eb7e8abae5d5cd3c558628c724bd44cb2584cfcee"} Dec 06 15:51:00 crc kubenswrapper[4848]: I1206 15:51:00.964955 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:51:00 crc kubenswrapper[4848]: I1206 15:51:00.965387 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.013662 4848 generic.go:334] "Generic (PLEG): container finished" podID="aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" containerID="e37b77da47c99c43de099f8dcc3f0726f7596dbe4eeb8ca1761bc8c082d88f49" exitCode=0 Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.013758 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8k5vr" 
event={"ID":"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f","Type":"ContainerDied","Data":"e37b77da47c99c43de099f8dcc3f0726f7596dbe4eeb8ca1761bc8c082d88f49"} Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.018117 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerStarted","Data":"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9"} Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.018144 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerStarted","Data":"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd"} Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.063050 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.063030403 podStartE2EDuration="3.063030403s" podCreationTimestamp="2025-12-06 15:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:01.048757068 +0000 UTC m=+1328.346767991" watchObservedRunningTime="2025-12-06 15:51:01.063030403 +0000 UTC m=+1328.361041316" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.196477 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.196530 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.227085 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.375946 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" 
Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.399678 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.482266 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.482814 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="dnsmasq-dns" containerID="cri-o://90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428" gracePeriod=10 Dec 06 15:51:01 crc kubenswrapper[4848]: I1206 15:51:01.574324 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.007688 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.046867 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.046898 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.048090 4848 generic.go:334] "Generic (PLEG): container finished" podID="5e6a9170-ec3f-4704-8526-7736f298a496" containerID="90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428" exitCode=0 Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.048128 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" event={"ID":"5e6a9170-ec3f-4704-8526-7736f298a496","Type":"ContainerDied","Data":"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428"} Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.048159 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" event={"ID":"5e6a9170-ec3f-4704-8526-7736f298a496","Type":"ContainerDied","Data":"5f91a90dc98aa7a52ee294a65505e3b0c51b0a8f3f549a0a35870f506604f587"} Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.048170 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-wq7q5" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.048174 4848 scope.go:117] "RemoveContainer" containerID="90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079040 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079225 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079263 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079314 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079377 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " 
Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.079506 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qr7\" (UniqueName: \"kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7\") pod \"5e6a9170-ec3f-4704-8526-7736f298a496\" (UID: \"5e6a9170-ec3f-4704-8526-7736f298a496\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.087564 4848 scope.go:117] "RemoveContainer" containerID="190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.087841 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7" (OuterVolumeSpecName: "kube-api-access-26qr7") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "kube-api-access-26qr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.106235 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.141642 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.164375 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config" (OuterVolumeSpecName: "config") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.181970 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26qr7\" (UniqueName: \"kubernetes.io/projected/5e6a9170-ec3f-4704-8526-7736f298a496-kube-api-access-26qr7\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.181998 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.182010 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.190812 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.190978 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.205420 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e6a9170-ec3f-4704-8526-7736f298a496" (UID: "5e6a9170-ec3f-4704-8526-7736f298a496"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.233256 4848 scope.go:117] "RemoveContainer" containerID="90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428" Dec 06 15:51:02 crc kubenswrapper[4848]: E1206 15:51:02.233919 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428\": container with ID starting with 90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428 not found: ID does not exist" containerID="90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.233981 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428"} err="failed to get container status \"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428\": rpc error: code = NotFound desc = could not find container \"90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428\": container with ID starting with 90b17acfca66b7f7a058138d80b04cd69e4eef22a0bae693d72f4b26ab2ff428 not found: ID does not exist" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.234008 4848 scope.go:117] "RemoveContainer" containerID="190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b" Dec 06 15:51:02 crc kubenswrapper[4848]: E1206 15:51:02.234371 4848 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b\": container with ID starting with 190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b not found: ID does not exist" containerID="190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.234406 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b"} err="failed to get container status \"190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b\": rpc error: code = NotFound desc = could not find container \"190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b\": container with ID starting with 190f41f79a70a579bd53ce0c19363fab0fa7aed835cabbf45296b6c04181423b not found: ID does not exist" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.284507 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.284545 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.284561 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6a9170-ec3f-4704-8526-7736f298a496-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.387267 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:51:02 crc kubenswrapper[4848]: 
I1206 15:51:02.392661 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.395627 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-wq7q5"] Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.487975 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle\") pod \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.488067 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd9wp\" (UniqueName: \"kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp\") pod \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.488230 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data\") pod \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.488262 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts\") pod \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\" (UID: \"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f\") " Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.492354 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp" (OuterVolumeSpecName: "kube-api-access-gd9wp") pod 
"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" (UID: "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f"). InnerVolumeSpecName "kube-api-access-gd9wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.492959 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts" (OuterVolumeSpecName: "scripts") pod "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" (UID: "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.519937 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data" (OuterVolumeSpecName: "config-data") pod "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" (UID: "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.536836 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" (UID: "aaf8dae4-61fd-4899-a2ea-07e6277f8c3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.593102 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.593143 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd9wp\" (UniqueName: \"kubernetes.io/projected/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-kube-api-access-gd9wp\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.593162 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.593175 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:02 crc kubenswrapper[4848]: I1206 15:51:02.976952 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" path="/var/lib/kubelet/pods/5e6a9170-ec3f-4704-8526-7736f298a496/volumes" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.059675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8k5vr" event={"ID":"aaf8dae4-61fd-4899-a2ea-07e6277f8c3f","Type":"ContainerDied","Data":"a1410770e66756e7c0910d86f61303202e6559ec56e043e4e56e2913505c3c1f"} Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.059729 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1410770e66756e7c0910d86f61303202e6559ec56e043e4e56e2913505c3c1f" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.059896 4848 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8k5vr" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.212949 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.213202 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-log" containerID="cri-o://e9675ba1442276ce233e746b947635987da20566e460835ead1c80f0f3f2b249" gracePeriod=30 Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.213308 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-api" containerID="cri-o://a0360a4059ffa229c437af1d3beae964e528311b5d83e103137c08020182a97e" gracePeriod=30 Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.231686 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.250570 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.250845 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-log" containerID="cri-o://6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" gracePeriod=30 Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.250917 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-metadata" containerID="cri-o://fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" gracePeriod=30 Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.847365 
4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.915596 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5l6n\" (UniqueName: \"kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n\") pod \"7b6fda35-6156-4081-bcec-6315368c6ea9\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.915756 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs\") pod \"7b6fda35-6156-4081-bcec-6315368c6ea9\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.915831 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data\") pod \"7b6fda35-6156-4081-bcec-6315368c6ea9\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.915923 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle\") pod \"7b6fda35-6156-4081-bcec-6315368c6ea9\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.916022 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs" (OuterVolumeSpecName: "logs") pod "7b6fda35-6156-4081-bcec-6315368c6ea9" (UID: "7b6fda35-6156-4081-bcec-6315368c6ea9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.916801 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs\") pod \"7b6fda35-6156-4081-bcec-6315368c6ea9\" (UID: \"7b6fda35-6156-4081-bcec-6315368c6ea9\") " Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.917383 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b6fda35-6156-4081-bcec-6315368c6ea9-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.920336 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n" (OuterVolumeSpecName: "kube-api-access-l5l6n") pod "7b6fda35-6156-4081-bcec-6315368c6ea9" (UID: "7b6fda35-6156-4081-bcec-6315368c6ea9"). InnerVolumeSpecName "kube-api-access-l5l6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.944032 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b6fda35-6156-4081-bcec-6315368c6ea9" (UID: "7b6fda35-6156-4081-bcec-6315368c6ea9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.962721 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data" (OuterVolumeSpecName: "config-data") pod "7b6fda35-6156-4081-bcec-6315368c6ea9" (UID: "7b6fda35-6156-4081-bcec-6315368c6ea9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:03 crc kubenswrapper[4848]: I1206 15:51:03.982210 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7b6fda35-6156-4081-bcec-6315368c6ea9" (UID: "7b6fda35-6156-4081-bcec-6315368c6ea9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.019429 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.019508 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.019563 4848 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b6fda35-6156-4081-bcec-6315368c6ea9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.019579 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5l6n\" (UniqueName: \"kubernetes.io/projected/7b6fda35-6156-4081-bcec-6315368c6ea9-kube-api-access-l5l6n\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093053 4848 generic.go:334] "Generic (PLEG): container finished" podID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerID="fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" exitCode=0 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093091 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerID="6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" exitCode=143 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093159 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093159 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerDied","Data":"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9"} Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093225 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerDied","Data":"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd"} Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093240 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7b6fda35-6156-4081-bcec-6315368c6ea9","Type":"ContainerDied","Data":"5ee7f46c6fdad0970458ef2eb7e8abae5d5cd3c558628c724bd44cb2584cfcee"} Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.093259 4848 scope.go:117] "RemoveContainer" containerID="fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.097023 4848 generic.go:334] "Generic (PLEG): container finished" podID="d719bb86-9c8a-47a5-9b01-010f0ac07dac" containerID="f6e1852ba42c8d6702ad8fb8a52d9026c79f96bf5ac698a3ad19be532e4185de" exitCode=0 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.097064 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" event={"ID":"d719bb86-9c8a-47a5-9b01-010f0ac07dac","Type":"ContainerDied","Data":"f6e1852ba42c8d6702ad8fb8a52d9026c79f96bf5ac698a3ad19be532e4185de"} Dec 06 15:51:04 crc 
kubenswrapper[4848]: I1206 15:51:04.099153 4848 generic.go:334] "Generic (PLEG): container finished" podID="38c0cf0f-a169-4585-9922-00437f53db61" containerID="e9675ba1442276ce233e746b947635987da20566e460835ead1c80f0f3f2b249" exitCode=143 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.099259 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerDied","Data":"e9675ba1442276ce233e746b947635987da20566e460835ead1c80f0f3f2b249"} Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.099300 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerName="nova-scheduler-scheduler" containerID="cri-o://4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" gracePeriod=30 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.127845 4848 scope.go:117] "RemoveContainer" containerID="6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.157474 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.159456 4848 scope.go:117] "RemoveContainer" containerID="fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.160288 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9\": container with ID starting with fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9 not found: ID does not exist" containerID="fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.160384 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9"} err="failed to get container status \"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9\": rpc error: code = NotFound desc = could not find container \"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9\": container with ID starting with fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9 not found: ID does not exist" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.160459 4848 scope.go:117] "RemoveContainer" containerID="6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.160856 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd\": container with ID starting with 6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd not found: ID does not exist" containerID="6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.160890 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd"} err="failed to get container status \"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd\": rpc error: code = NotFound desc = could not find container \"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd\": container with ID starting with 6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd not found: ID does not exist" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.160906 4848 scope.go:117] "RemoveContainer" containerID="fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.161139 4848 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9"} err="failed to get container status \"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9\": rpc error: code = NotFound desc = could not find container \"fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9\": container with ID starting with fb9eb04782c2db4204f09391957940d995dacbda68f0f699a67205f86bea6de9 not found: ID does not exist" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.161158 4848 scope.go:117] "RemoveContainer" containerID="6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.161366 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd"} err="failed to get container status \"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd\": rpc error: code = NotFound desc = could not find container \"6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd\": container with ID starting with 6210a50cfb3108a1181d1fcbb43c7c564b7b1c315b2356b0bb3d2a7394534ccd not found: ID does not exist" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.170121 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.192235 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.192867 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="init" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.192990 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="init" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.193083 4848 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-log" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.193165 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-log" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.193257 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" containerName="nova-manage" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.193328 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" containerName="nova-manage" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.193419 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-metadata" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.193494 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-metadata" Dec 06 15:51:04 crc kubenswrapper[4848]: E1206 15:51:04.193571 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="dnsmasq-dns" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.193674 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="dnsmasq-dns" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.193983 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" containerName="nova-manage" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.194086 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6a9170-ec3f-4704-8526-7736f298a496" containerName="dnsmasq-dns" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.194159 4848 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-metadata" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.194237 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" containerName="nova-metadata-log" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.195627 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.197615 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.197655 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.204577 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.328235 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.328301 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4nv\" (UniqueName: \"kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.328359 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.328383 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.328494 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.430236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.430292 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.430374 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " 
pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.430457 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.430517 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4nv\" (UniqueName: \"kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.431269 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.434937 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.435272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.435430 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.446883 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4nv\" (UniqueName: \"kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv\") pod \"nova-metadata-0\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.517494 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:04 crc kubenswrapper[4848]: W1206 15:51:04.973879 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851 WatchSource:0}: Error finding container d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851: Status 404 returned error can't find the container with id d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851 Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.982633 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6fda35-6156-4081-bcec-6315368c6ea9" path="/var/lib/kubelet/pods/7b6fda35-6156-4081-bcec-6315368c6ea9/volumes" Dec 06 15:51:04 crc kubenswrapper[4848]: I1206 15:51:04.984095 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.109386 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerStarted","Data":"d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851"} Dec 06 15:51:05 crc 
kubenswrapper[4848]: I1206 15:51:05.362266 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.447541 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts\") pod \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.447829 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nq9m\" (UniqueName: \"kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m\") pod \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.447863 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle\") pod \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.447923 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data\") pod \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\" (UID: \"d719bb86-9c8a-47a5-9b01-010f0ac07dac\") " Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.451574 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts" (OuterVolumeSpecName: "scripts") pod "d719bb86-9c8a-47a5-9b01-010f0ac07dac" (UID: "d719bb86-9c8a-47a5-9b01-010f0ac07dac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.451944 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m" (OuterVolumeSpecName: "kube-api-access-4nq9m") pod "d719bb86-9c8a-47a5-9b01-010f0ac07dac" (UID: "d719bb86-9c8a-47a5-9b01-010f0ac07dac"). InnerVolumeSpecName "kube-api-access-4nq9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.479385 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data" (OuterVolumeSpecName: "config-data") pod "d719bb86-9c8a-47a5-9b01-010f0ac07dac" (UID: "d719bb86-9c8a-47a5-9b01-010f0ac07dac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.481007 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d719bb86-9c8a-47a5-9b01-010f0ac07dac" (UID: "d719bb86-9c8a-47a5-9b01-010f0ac07dac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.551373 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.551420 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.551437 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nq9m\" (UniqueName: \"kubernetes.io/projected/d719bb86-9c8a-47a5-9b01-010f0ac07dac-kube-api-access-4nq9m\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:05 crc kubenswrapper[4848]: I1206 15:51:05.551487 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d719bb86-9c8a-47a5-9b01-010f0ac07dac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.123098 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" event={"ID":"d719bb86-9c8a-47a5-9b01-010f0ac07dac","Type":"ContainerDied","Data":"9f3176b572353df94600c9701b12dd24e9bbcba39ab08dd82502008f9f761560"} Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.123405 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3176b572353df94600c9701b12dd24e9bbcba39ab08dd82502008f9f761560" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.123111 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hbwm" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.125196 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerStarted","Data":"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd"} Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.125242 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerStarted","Data":"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119"} Dec 06 15:51:06 crc kubenswrapper[4848]: E1206 15:51:06.201755 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:06 crc kubenswrapper[4848]: E1206 15:51:06.205449 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.205837 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.205819031 podStartE2EDuration="2.205819031s" podCreationTimestamp="2025-12-06 15:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:06.150058858 +0000 UTC m=+1333.448069771" watchObservedRunningTime="2025-12-06 15:51:06.205819031 +0000 
UTC m=+1333.503829934" Dec 06 15:51:06 crc kubenswrapper[4848]: E1206 15:51:06.207120 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:06 crc kubenswrapper[4848]: E1206 15:51:06.207260 4848 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerName="nova-scheduler-scheduler" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.233385 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 15:51:06 crc kubenswrapper[4848]: E1206 15:51:06.233929 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d719bb86-9c8a-47a5-9b01-010f0ac07dac" containerName="nova-cell1-conductor-db-sync" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.233947 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d719bb86-9c8a-47a5-9b01-010f0ac07dac" containerName="nova-cell1-conductor-db-sync" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.234193 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d719bb86-9c8a-47a5-9b01-010f0ac07dac" containerName="nova-cell1-conductor-db-sync" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.235482 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.240965 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.264529 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.264758 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.264795 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lg2x\" (UniqueName: \"kubernetes.io/projected/40340eae-e441-4326-b678-265f2cd36d20-kube-api-access-9lg2x\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.271541 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.366632 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc 
kubenswrapper[4848]: I1206 15:51:06.366685 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lg2x\" (UniqueName: \"kubernetes.io/projected/40340eae-e441-4326-b678-265f2cd36d20-kube-api-access-9lg2x\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.367168 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.378309 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.383151 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lg2x\" (UniqueName: \"kubernetes.io/projected/40340eae-e441-4326-b678-265f2cd36d20-kube-api-access-9lg2x\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.383207 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40340eae-e441-4326-b678-265f2cd36d20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"40340eae-e441-4326-b678-265f2cd36d20\") " pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:06 crc kubenswrapper[4848]: I1206 15:51:06.574871 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:07 crc kubenswrapper[4848]: I1206 15:51:07.019298 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 15:51:07 crc kubenswrapper[4848]: W1206 15:51:07.029327 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40340eae_e441_4326_b678_265f2cd36d20.slice/crio-c4bd0cbd09034182a6e11e6eeabd2474f26d41118ea280afb10aaded302be4b3 WatchSource:0}: Error finding container c4bd0cbd09034182a6e11e6eeabd2474f26d41118ea280afb10aaded302be4b3: Status 404 returned error can't find the container with id c4bd0cbd09034182a6e11e6eeabd2474f26d41118ea280afb10aaded302be4b3 Dec 06 15:51:07 crc kubenswrapper[4848]: I1206 15:51:07.134524 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"40340eae-e441-4326-b678-265f2cd36d20","Type":"ContainerStarted","Data":"c4bd0cbd09034182a6e11e6eeabd2474f26d41118ea280afb10aaded302be4b3"} Dec 06 15:51:07 crc kubenswrapper[4848]: I1206 15:51:07.964062 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.145517 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"40340eae-e441-4326-b678-265f2cd36d20","Type":"ContainerStarted","Data":"7453a9072c2be7731c1b1a246a87bc8d4270175f631db8a76f6be27fb9b8c0ac"} Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.146587 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.148848 4848 generic.go:334] "Generic (PLEG): container finished" podID="38c0cf0f-a169-4585-9922-00437f53db61" containerID="a0360a4059ffa229c437af1d3beae964e528311b5d83e103137c08020182a97e" exitCode=0 Dec 06 15:51:08 crc 
kubenswrapper[4848]: I1206 15:51:08.148865 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerDied","Data":"a0360a4059ffa229c437af1d3beae964e528311b5d83e103137c08020182a97e"} Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.173181 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.17316162 podStartE2EDuration="2.17316162s" podCreationTimestamp="2025-12-06 15:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:08.171174087 +0000 UTC m=+1335.469185020" watchObservedRunningTime="2025-12-06 15:51:08.17316162 +0000 UTC m=+1335.471172533" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.616237 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.715511 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data\") pod \"38c0cf0f-a169-4585-9922-00437f53db61\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.715913 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs\") pod \"38c0cf0f-a169-4585-9922-00437f53db61\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.716042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k89xj\" (UniqueName: \"kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj\") pod 
\"38c0cf0f-a169-4585-9922-00437f53db61\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.716205 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle\") pod \"38c0cf0f-a169-4585-9922-00437f53db61\" (UID: \"38c0cf0f-a169-4585-9922-00437f53db61\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.717059 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs" (OuterVolumeSpecName: "logs") pod "38c0cf0f-a169-4585-9922-00437f53db61" (UID: "38c0cf0f-a169-4585-9922-00437f53db61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.722607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj" (OuterVolumeSpecName: "kube-api-access-k89xj") pod "38c0cf0f-a169-4585-9922-00437f53db61" (UID: "38c0cf0f-a169-4585-9922-00437f53db61"). InnerVolumeSpecName "kube-api-access-k89xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.744942 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data" (OuterVolumeSpecName: "config-data") pod "38c0cf0f-a169-4585-9922-00437f53db61" (UID: "38c0cf0f-a169-4585-9922-00437f53db61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.753047 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38c0cf0f-a169-4585-9922-00437f53db61" (UID: "38c0cf0f-a169-4585-9922-00437f53db61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.811289 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.818433 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.818646 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c0cf0f-a169-4585-9922-00437f53db61-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.818814 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38c0cf0f-a169-4585-9922-00437f53db61-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.818893 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k89xj\" (UniqueName: \"kubernetes.io/projected/38c0cf0f-a169-4585-9922-00437f53db61-kube-api-access-k89xj\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.919965 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle\") pod \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.920140 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krmt9\" (UniqueName: \"kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9\") pod \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.920168 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") pod \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.925181 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9" (OuterVolumeSpecName: "kube-api-access-krmt9") pod "13bdfbbb-3bd3-41ad-8cf7-7c2825244824" (UID: "13bdfbbb-3bd3-41ad-8cf7-7c2825244824"). InnerVolumeSpecName "kube-api-access-krmt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:08 crc kubenswrapper[4848]: E1206 15:51:08.948679 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data podName:13bdfbbb-3bd3-41ad-8cf7-7c2825244824 nodeName:}" failed. No retries permitted until 2025-12-06 15:51:09.44865025 +0000 UTC m=+1336.746661163 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data") pod "13bdfbbb-3bd3-41ad-8cf7-7c2825244824" (UID: "13bdfbbb-3bd3-41ad-8cf7-7c2825244824") : error deleting /var/lib/kubelet/pods/13bdfbbb-3bd3-41ad-8cf7-7c2825244824/volume-subpaths: remove /var/lib/kubelet/pods/13bdfbbb-3bd3-41ad-8cf7-7c2825244824/volume-subpaths: no such file or directory Dec 06 15:51:08 crc kubenswrapper[4848]: I1206 15:51:08.960053 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13bdfbbb-3bd3-41ad-8cf7-7c2825244824" (UID: "13bdfbbb-3bd3-41ad-8cf7-7c2825244824"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.024138 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.024181 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krmt9\" (UniqueName: \"kubernetes.io/projected/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-kube-api-access-krmt9\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.157816 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38c0cf0f-a169-4585-9922-00437f53db61","Type":"ContainerDied","Data":"3b35df775fbe6d9f2362ab2e0226e24f0606becebb2369979ea5e37de51f776a"} Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.157866 4848 scope.go:117] "RemoveContainer" containerID="a0360a4059ffa229c437af1d3beae964e528311b5d83e103137c08020182a97e" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.158026 4848 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.162534 4848 generic.go:334] "Generic (PLEG): container finished" podID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" exitCode=0 Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.163805 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.164089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"13bdfbbb-3bd3-41ad-8cf7-7c2825244824","Type":"ContainerDied","Data":"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7"} Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.164192 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"13bdfbbb-3bd3-41ad-8cf7-7c2825244824","Type":"ContainerDied","Data":"b44b400d000ccf3c84ed255041940eb8b8ac0ec70357c8c098b8ba94319b561d"} Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.182987 4848 scope.go:117] "RemoveContainer" containerID="e9675ba1442276ce233e746b947635987da20566e460835ead1c80f0f3f2b249" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.203217 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.223666 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.223824 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: E1206 15:51:09.224277 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-api" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 
15:51:09.224363 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-api" Dec 06 15:51:09 crc kubenswrapper[4848]: E1206 15:51:09.224445 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-log" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.224501 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-log" Dec 06 15:51:09 crc kubenswrapper[4848]: E1206 15:51:09.224577 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerName="nova-scheduler-scheduler" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.224639 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerName="nova-scheduler-scheduler" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.224893 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-api" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.224968 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c0cf0f-a169-4585-9922-00437f53db61" containerName="nova-api-log" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.225041 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" containerName="nova-scheduler-scheduler" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.237549 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.237867 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.246653 4848 scope.go:117] "RemoveContainer" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.246890 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.289756 4848 scope.go:117] "RemoveContainer" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" Dec 06 15:51:09 crc kubenswrapper[4848]: E1206 15:51:09.290406 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7\": container with ID starting with 4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7 not found: ID does not exist" containerID="4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.290449 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7"} err="failed to get container status \"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7\": rpc error: code = NotFound desc = could not find container \"4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7\": container with ID starting with 4241d94c03b97f5968a300ba4e6441318517547120e38d9767435279b7dbb5c7 not found: ID does not exist" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.332284 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 
15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.332584 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.332888 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.333079 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkszh\" (UniqueName: \"kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.434900 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.435606 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.436017 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.436155 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkszh\" (UniqueName: \"kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.435534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.442454 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.443363 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.454830 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkszh\" (UniqueName: \"kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh\") pod \"nova-api-0\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.518348 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.518404 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.537183 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") pod \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\" (UID: \"13bdfbbb-3bd3-41ad-8cf7-7c2825244824\") " Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.540410 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data" (OuterVolumeSpecName: "config-data") pod "13bdfbbb-3bd3-41ad-8cf7-7c2825244824" (UID: "13bdfbbb-3bd3-41ad-8cf7-7c2825244824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.584573 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.639142 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bdfbbb-3bd3-41ad-8cf7-7c2825244824-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.809943 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.822468 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.833782 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.835130 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.837739 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.854283 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.944217 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.944281 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " 
pod="openstack/nova-scheduler-0" Dec 06 15:51:09 crc kubenswrapper[4848]: I1206 15:51:09.944385 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5kz\" (UniqueName: \"kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.047012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.047136 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.047299 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5kz\" (UniqueName: \"kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.052556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.052959 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.065705 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5kz\" (UniqueName: \"kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz\") pod \"nova-scheduler-0\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.068360 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.150291 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.174675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerStarted","Data":"8775b7de7776a9db2e90e11b19f1943b5527e9fb3683b5ab1c7e465f0dee786a"} Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.634839 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.977996 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13bdfbbb-3bd3-41ad-8cf7-7c2825244824" path="/var/lib/kubelet/pods/13bdfbbb-3bd3-41ad-8cf7-7c2825244824/volumes" Dec 06 15:51:10 crc kubenswrapper[4848]: I1206 15:51:10.979204 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c0cf0f-a169-4585-9922-00437f53db61" path="/var/lib/kubelet/pods/38c0cf0f-a169-4585-9922-00437f53db61/volumes" Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.184798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"fb793c37-98f3-42b2-be40-3778672cb7d6","Type":"ContainerStarted","Data":"49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378"} Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.184854 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb793c37-98f3-42b2-be40-3778672cb7d6","Type":"ContainerStarted","Data":"ee19a670ee9ba6055042c69f5209b91792d579f8c982b6342f451580c1872ea1"} Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.193903 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerStarted","Data":"d826aca8307b94dd27c38288b71ba7b4e0a69a465d297665c0f60ab60cbdf4b1"} Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.193957 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerStarted","Data":"d13c0676bddb16e11c0e1e2cfa6661d8152296db2fa6ab094e60a3d25107ed53"} Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.209582 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.209563819 podStartE2EDuration="2.209563819s" podCreationTimestamp="2025-12-06 15:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:11.200791672 +0000 UTC m=+1338.498802595" watchObservedRunningTime="2025-12-06 15:51:11.209563819 +0000 UTC m=+1338.507574732" Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.223352 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.22333525 podStartE2EDuration="2.22333525s" podCreationTimestamp="2025-12-06 15:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:11.219155967 +0000 UTC m=+1338.517166881" watchObservedRunningTime="2025-12-06 15:51:11.22333525 +0000 UTC m=+1338.521346163" Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.828766 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:11 crc kubenswrapper[4848]: I1206 15:51:11.829023 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" containerName="kube-state-metrics" containerID="cri-o://4b46177a45d811e1bd3adc40ee2a136ca05f1ba1780b017404fe23ff2e72d937" gracePeriod=30 Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.222812 4848 generic.go:334] "Generic (PLEG): container finished" podID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" containerID="4b46177a45d811e1bd3adc40ee2a136ca05f1ba1780b017404fe23ff2e72d937" exitCode=2 Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.223437 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e3135a6e-f317-4bd7-84dd-788f65fc87a0","Type":"ContainerDied","Data":"4b46177a45d811e1bd3adc40ee2a136ca05f1ba1780b017404fe23ff2e72d937"} Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.355747 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.517957 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47pk\" (UniqueName: \"kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk\") pod \"e3135a6e-f317-4bd7-84dd-788f65fc87a0\" (UID: \"e3135a6e-f317-4bd7-84dd-788f65fc87a0\") " Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.527110 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk" (OuterVolumeSpecName: "kube-api-access-j47pk") pod "e3135a6e-f317-4bd7-84dd-788f65fc87a0" (UID: "e3135a6e-f317-4bd7-84dd-788f65fc87a0"). InnerVolumeSpecName "kube-api-access-j47pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:12 crc kubenswrapper[4848]: I1206 15:51:12.619889 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47pk\" (UniqueName: \"kubernetes.io/projected/e3135a6e-f317-4bd7-84dd-788f65fc87a0-kube-api-access-j47pk\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.234193 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e3135a6e-f317-4bd7-84dd-788f65fc87a0","Type":"ContainerDied","Data":"ef63aa2111c7d56a75863302b8aa03ba21b70b5591d23fdb14c85de8ad81c276"} Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.234293 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.234537 4848 scope.go:117] "RemoveContainer" containerID="4b46177a45d811e1bd3adc40ee2a136ca05f1ba1780b017404fe23ff2e72d937" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.264050 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.275271 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.286254 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:13 crc kubenswrapper[4848]: E1206 15:51:13.286885 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" containerName="kube-state-metrics" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.286908 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" containerName="kube-state-metrics" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.287120 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" containerName="kube-state-metrics" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.287765 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.292715 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.292719 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.311381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.436178 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cgtq\" (UniqueName: \"kubernetes.io/projected/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-api-access-5cgtq\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.436308 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.436373 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.437142 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.539601 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.540064 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cgtq\" (UniqueName: \"kubernetes.io/projected/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-api-access-5cgtq\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.540232 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.540375 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.545595 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.546492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.547719 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9aa43e5-ee37-49dc-8278-f2018f524c42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.564877 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cgtq\" (UniqueName: \"kubernetes.io/projected/e9aa43e5-ee37-49dc-8278-f2018f524c42-kube-api-access-5cgtq\") pod \"kube-state-metrics-0\" (UID: \"e9aa43e5-ee37-49dc-8278-f2018f524c42\") " pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.618989 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.787691 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.788286 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-central-agent" containerID="cri-o://ec02952907aa06b02890acf47dfd39efa4bdd7ad974ba0c683bea2066623db01" gracePeriod=30 Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.789499 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="proxy-httpd" containerID="cri-o://adfad597cc41baeb5d080038455d5937394910bfeae368e714cd413683773790" gracePeriod=30 Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.789578 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="sg-core" containerID="cri-o://6f1d0ee1f506c31e0d0460bd13963b1647513852d3905cdc837f805c0d6f432b" gracePeriod=30 Dec 06 15:51:13 crc kubenswrapper[4848]: I1206 15:51:13.789620 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-notification-agent" containerID="cri-o://023d2db23ce2bb90a41ac0c248dfd419e5a789661b3ba84fb98446a3a3fb7085" gracePeriod=30 Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.083274 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 15:51:14 crc kubenswrapper[4848]: W1206 15:51:14.088210 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9aa43e5_ee37_49dc_8278_f2018f524c42.slice/crio-84e25cc8df5e23592e69eeb0455f5aa897927c0a195b3e5fbc3ec89eb8bfe63e WatchSource:0}: Error finding container 84e25cc8df5e23592e69eeb0455f5aa897927c0a195b3e5fbc3ec89eb8bfe63e: Status 404 returned error can't find the container with id 84e25cc8df5e23592e69eeb0455f5aa897927c0a195b3e5fbc3ec89eb8bfe63e Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.091969 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.244866 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e9aa43e5-ee37-49dc-8278-f2018f524c42","Type":"ContainerStarted","Data":"84e25cc8df5e23592e69eeb0455f5aa897927c0a195b3e5fbc3ec89eb8bfe63e"} Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.247428 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb85487f-67a6-4596-86b5-5c6f18797527" containerID="adfad597cc41baeb5d080038455d5937394910bfeae368e714cd413683773790" exitCode=0 Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.247451 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb85487f-67a6-4596-86b5-5c6f18797527" containerID="6f1d0ee1f506c31e0d0460bd13963b1647513852d3905cdc837f805c0d6f432b" exitCode=2 Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.247458 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb85487f-67a6-4596-86b5-5c6f18797527" containerID="ec02952907aa06b02890acf47dfd39efa4bdd7ad974ba0c683bea2066623db01" exitCode=0 Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.247461 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerDied","Data":"adfad597cc41baeb5d080038455d5937394910bfeae368e714cd413683773790"} Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 
15:51:14.247500 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerDied","Data":"6f1d0ee1f506c31e0d0460bd13963b1647513852d3905cdc837f805c0d6f432b"} Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.247513 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerDied","Data":"ec02952907aa06b02890acf47dfd39efa4bdd7ad974ba0c683bea2066623db01"} Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.522640 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.523019 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 15:51:14 crc kubenswrapper[4848]: I1206 15:51:14.983000 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3135a6e-f317-4bd7-84dd-788f65fc87a0" path="/var/lib/kubelet/pods/e3135a6e-f317-4bd7-84dd-788f65fc87a0/volumes" Dec 06 15:51:15 crc kubenswrapper[4848]: I1206 15:51:15.151402 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 15:51:15 crc kubenswrapper[4848]: I1206 15:51:15.258464 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e9aa43e5-ee37-49dc-8278-f2018f524c42","Type":"ContainerStarted","Data":"452ce66fb8362e18b87255e6d9fa1b84a0d9b318586502da862311bd74e6460a"} Dec 06 15:51:15 crc kubenswrapper[4848]: I1206 15:51:15.289870 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9225119560000001 podStartE2EDuration="2.289852935s" podCreationTimestamp="2025-12-06 15:51:13 +0000 UTC" firstStartedPulling="2025-12-06 15:51:14.091642414 +0000 UTC m=+1341.389653337" 
lastFinishedPulling="2025-12-06 15:51:14.458983403 +0000 UTC m=+1341.756994316" observedRunningTime="2025-12-06 15:51:15.274778739 +0000 UTC m=+1342.572789672" watchObservedRunningTime="2025-12-06 15:51:15.289852935 +0000 UTC m=+1342.587863848" Dec 06 15:51:15 crc kubenswrapper[4848]: I1206 15:51:15.567273 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:15 crc kubenswrapper[4848]: I1206 15:51:15.567270 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:16 crc kubenswrapper[4848]: I1206 15:51:16.270522 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb85487f-67a6-4596-86b5-5c6f18797527" containerID="023d2db23ce2bb90a41ac0c248dfd419e5a789661b3ba84fb98446a3a3fb7085" exitCode=0 Dec 06 15:51:16 crc kubenswrapper[4848]: I1206 15:51:16.270720 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerDied","Data":"023d2db23ce2bb90a41ac0c248dfd419e5a789661b3ba84fb98446a3a3fb7085"} Dec 06 15:51:16 crc kubenswrapper[4848]: I1206 15:51:16.271115 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 15:51:16 crc kubenswrapper[4848]: I1206 15:51:16.602764 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 15:51:16 crc kubenswrapper[4848]: I1206 15:51:16.982015 4848 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.011606 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.011681 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.011798 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4c7\" (UniqueName: \"kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.011848 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.011877 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.052071 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.053468 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.055608 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7" (OuterVolumeSpecName: "kube-api-access-8z4c7") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "kube-api-access-8z4c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.058469 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts" (OuterVolumeSpecName: "scripts") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.058852 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.112932 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113265 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle\") pod \"fb85487f-67a6-4596-86b5-5c6f18797527\" (UID: \"fb85487f-67a6-4596-86b5-5c6f18797527\") " Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113629 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113644 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113655 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4c7\" (UniqueName: \"kubernetes.io/projected/fb85487f-67a6-4596-86b5-5c6f18797527-kube-api-access-8z4c7\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113665 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.113674 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fb85487f-67a6-4596-86b5-5c6f18797527-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.200927 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.210686 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data" (OuterVolumeSpecName: "config-data") pod "fb85487f-67a6-4596-86b5-5c6f18797527" (UID: "fb85487f-67a6-4596-86b5-5c6f18797527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.214905 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.214936 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb85487f-67a6-4596-86b5-5c6f18797527-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.286409 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.286991 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb85487f-67a6-4596-86b5-5c6f18797527","Type":"ContainerDied","Data":"c7b3b227c3dd44ec1b7f92194a3cc785115f4a9e56714991fb6d2d4a8eae9af0"} Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.288028 4848 scope.go:117] "RemoveContainer" containerID="adfad597cc41baeb5d080038455d5937394910bfeae368e714cd413683773790" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.313724 4848 scope.go:117] "RemoveContainer" containerID="6f1d0ee1f506c31e0d0460bd13963b1647513852d3905cdc837f805c0d6f432b" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.331572 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.338218 4848 scope.go:117] "RemoveContainer" containerID="023d2db23ce2bb90a41ac0c248dfd419e5a789661b3ba84fb98446a3a3fb7085" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.339722 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.369411 4848 scope.go:117] "RemoveContainer" containerID="ec02952907aa06b02890acf47dfd39efa4bdd7ad974ba0c683bea2066623db01" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.374859 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:17 crc kubenswrapper[4848]: E1206 15:51:17.375406 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="proxy-httpd" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375428 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="proxy-httpd" Dec 06 15:51:17 crc kubenswrapper[4848]: E1206 15:51:17.375466 4848 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-central-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375476 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-central-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: E1206 15:51:17.375493 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-notification-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375501 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-notification-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: E1206 15:51:17.375518 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="sg-core" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375526 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="sg-core" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375783 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="proxy-httpd" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375811 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-notification-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375823 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="sg-core" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.375845 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" containerName="ceilometer-central-agent" Dec 06 15:51:17 crc kubenswrapper[4848]: 
I1206 15:51:17.378050 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.382867 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.385053 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.385236 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.386793 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423487 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423509 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423541 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbdn\" (UniqueName: \"kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423587 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423622 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423652 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.423669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525324 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525397 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525437 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525460 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525514 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525564 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: 
I1206 15:51:17.525587 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.525636 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbdn\" (UniqueName: \"kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.526429 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.526777 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.530234 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.530682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data\") pod \"ceilometer-0\" (UID: 
\"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.531557 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.536612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.538046 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.551213 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbdn\" (UniqueName: \"kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn\") pod \"ceilometer-0\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " pod="openstack/ceilometer-0" Dec 06 15:51:17 crc kubenswrapper[4848]: I1206 15:51:17.693426 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:18 crc kubenswrapper[4848]: I1206 15:51:18.209327 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:18 crc kubenswrapper[4848]: I1206 15:51:18.299799 4848 generic.go:334] "Generic (PLEG): container finished" podID="ab198686-7839-4e39-abdb-ea9b65893a02" containerID="b0da0f6e4da0be9213843fe02b4ec52ec72e4fffb1cb7b2e734d96ea462bd20b" exitCode=0 Dec 06 15:51:18 crc kubenswrapper[4848]: I1206 15:51:18.299876 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerDied","Data":"b0da0f6e4da0be9213843fe02b4ec52ec72e4fffb1cb7b2e734d96ea462bd20b"} Dec 06 15:51:18 crc kubenswrapper[4848]: I1206 15:51:18.302827 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerStarted","Data":"612d3c7d4f011191e51b3f8809909d16c0617520eaa67692e4a7b87d479a1f12"} Dec 06 15:51:18 crc kubenswrapper[4848]: I1206 15:51:18.983633 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb85487f-67a6-4596-86b5-5c6f18797527" path="/var/lib/kubelet/pods/fb85487f-67a6-4596-86b5-5c6f18797527/volumes" Dec 06 15:51:19 crc kubenswrapper[4848]: I1206 15:51:19.319222 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerStarted","Data":"cd44e169e935af357be7bd7e466c1895af6e1e964b2324c0b3561f6fb9f07752"} Dec 06 15:51:19 crc kubenswrapper[4848]: I1206 15:51:19.323059 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"b03d208ca2865f0e78106310190fd04b684898935feabe295b7a51660005626e"} Dec 06 15:51:19 crc kubenswrapper[4848]: I1206 15:51:19.323085 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"041f8bee65b337e06a277adfe6cf0ce6cea1b7364e3e5ceed77d0ba3cb7a04b0"} Dec 06 15:51:19 crc kubenswrapper[4848]: I1206 15:51:19.585665 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:51:19 crc kubenswrapper[4848]: I1206 15:51:19.585765 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.151346 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.183793 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.336552 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerStarted","Data":"ff58a4bad97d41d6e96c0e1e04d88ad9cc86b5d7651f15322229e617add19ab7"} Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.345636 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"ab198686-7839-4e39-abdb-ea9b65893a02","Type":"ContainerStarted","Data":"423153cea57183d34f32b8f749e18e251d362dd8cb88b86fb6473d0b58c5eaf4"} Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.346033 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.390108 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=-9223371925.464693 podStartE2EDuration="1m51.390082185s" podCreationTimestamp="2025-12-06 15:49:29 +0000 UTC" firstStartedPulling="2025-12-06 
15:49:34.252181784 +0000 UTC m=+1241.550192707" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:20.382180982 +0000 UTC m=+1347.680191915" watchObservedRunningTime="2025-12-06 15:51:20.390082185 +0000 UTC m=+1347.688093098" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.411171 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.626978 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:20 crc kubenswrapper[4848]: I1206 15:51:20.627034 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:51:21 crc kubenswrapper[4848]: I1206 15:51:21.203935 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Dec 06 15:51:21 crc kubenswrapper[4848]: I1206 15:51:21.356183 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerStarted","Data":"c759262a6455e528967b6b30f3b46a19641a2eff3df8e06403cb637038fbd9d2"} Dec 06 15:51:22 crc kubenswrapper[4848]: I1206 15:51:22.541838 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Dec 06 15:51:23 crc kubenswrapper[4848]: I1206 15:51:23.377598 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerStarted","Data":"175521c6e79a9a6c821230b4aa817ef9a9fefb01ae011617c544c9ee4752336d"} Dec 06 15:51:23 crc kubenswrapper[4848]: I1206 15:51:23.377924 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:51:23 crc kubenswrapper[4848]: I1206 15:51:23.431031 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.933082722 podStartE2EDuration="6.431009774s" podCreationTimestamp="2025-12-06 15:51:17 +0000 UTC" firstStartedPulling="2025-12-06 15:51:18.216823641 +0000 UTC m=+1345.514834554" lastFinishedPulling="2025-12-06 15:51:22.714750693 +0000 UTC m=+1350.012761606" observedRunningTime="2025-12-06 15:51:23.421334413 +0000 UTC m=+1350.719345316" watchObservedRunningTime="2025-12-06 15:51:23.431009774 +0000 UTC m=+1350.729020687" Dec 06 15:51:23 crc kubenswrapper[4848]: I1206 15:51:23.628207 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 15:51:24 crc kubenswrapper[4848]: I1206 15:51:24.525636 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 15:51:24 crc kubenswrapper[4848]: I1206 15:51:24.530201 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 15:51:24 crc kubenswrapper[4848]: I1206 15:51:24.534576 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 15:51:25 crc kubenswrapper[4848]: I1206 15:51:25.408347 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.334975 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.420257 4848 generic.go:334] "Generic (PLEG): container finished" podID="54827637-22f6-42b6-afe6-bbb21ae65924" containerID="f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a" exitCode=137 Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.420314 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.420371 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54827637-22f6-42b6-afe6-bbb21ae65924","Type":"ContainerDied","Data":"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a"} Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.420462 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54827637-22f6-42b6-afe6-bbb21ae65924","Type":"ContainerDied","Data":"2f47f9bfba5961e9702e7308d1c27943ea17a137eb6ff7656f7d55db6a72dc03"} Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.420486 4848 scope.go:117] "RemoveContainer" containerID="f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.448450 4848 scope.go:117] "RemoveContainer" containerID="f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a" Dec 06 15:51:27 crc kubenswrapper[4848]: E1206 15:51:27.449097 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a\": container with ID starting with f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a not found: ID does not exist" containerID="f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 
15:51:27.449274 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a"} err="failed to get container status \"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a\": rpc error: code = NotFound desc = could not find container \"f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a\": container with ID starting with f98d011a1b420bf7a8c32ead193fe8d41e9e030f12b4f5ea28402e6c82448a6a not found: ID does not exist" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.468096 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data\") pod \"54827637-22f6-42b6-afe6-bbb21ae65924\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.468161 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsxc6\" (UniqueName: \"kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6\") pod \"54827637-22f6-42b6-afe6-bbb21ae65924\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.468475 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle\") pod \"54827637-22f6-42b6-afe6-bbb21ae65924\" (UID: \"54827637-22f6-42b6-afe6-bbb21ae65924\") " Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.501308 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6" (OuterVolumeSpecName: "kube-api-access-nsxc6") pod "54827637-22f6-42b6-afe6-bbb21ae65924" (UID: "54827637-22f6-42b6-afe6-bbb21ae65924"). 
InnerVolumeSpecName "kube-api-access-nsxc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.505353 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data" (OuterVolumeSpecName: "config-data") pod "54827637-22f6-42b6-afe6-bbb21ae65924" (UID: "54827637-22f6-42b6-afe6-bbb21ae65924"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.507167 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54827637-22f6-42b6-afe6-bbb21ae65924" (UID: "54827637-22f6-42b6-afe6-bbb21ae65924"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.570681 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.570734 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54827637-22f6-42b6-afe6-bbb21ae65924-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.570743 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsxc6\" (UniqueName: \"kubernetes.io/projected/54827637-22f6-42b6-afe6-bbb21ae65924-kube-api-access-nsxc6\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.768994 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 
15:51:27.781748 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.796428 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:51:27 crc kubenswrapper[4848]: E1206 15:51:27.796867 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54827637-22f6-42b6-afe6-bbb21ae65924" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.796893 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54827637-22f6-42b6-afe6-bbb21ae65924" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.797189 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54827637-22f6-42b6-afe6-bbb21ae65924" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.797829 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.801095 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.801210 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.802130 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.805744 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.877035 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwg7\" (UniqueName: \"kubernetes.io/projected/789dd8b9-2530-46a7-b5ee-7276afa689fb-kube-api-access-cqwg7\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.877149 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.877197 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc 
kubenswrapper[4848]: I1206 15:51:27.877221 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.877291 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.979437 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.979492 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.979571 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 
15:51:27.979671 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwg7\" (UniqueName: \"kubernetes.io/projected/789dd8b9-2530-46a7-b5ee-7276afa689fb-kube-api-access-cqwg7\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.979746 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.983415 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.983488 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.984178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:27 crc kubenswrapper[4848]: I1206 15:51:27.984256 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/789dd8b9-2530-46a7-b5ee-7276afa689fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:28 crc kubenswrapper[4848]: I1206 15:51:28.002736 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwg7\" (UniqueName: \"kubernetes.io/projected/789dd8b9-2530-46a7-b5ee-7276afa689fb-kube-api-access-cqwg7\") pod \"nova-cell1-novncproxy-0\" (UID: \"789dd8b9-2530-46a7-b5ee-7276afa689fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:28 crc kubenswrapper[4848]: I1206 15:51:28.117659 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:28 crc kubenswrapper[4848]: I1206 15:51:28.600071 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 15:51:28 crc kubenswrapper[4848]: I1206 15:51:28.977546 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54827637-22f6-42b6-afe6-bbb21ae65924" path="/var/lib/kubelet/pods/54827637-22f6-42b6-afe6-bbb21ae65924/volumes" Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.457330 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"789dd8b9-2530-46a7-b5ee-7276afa689fb","Type":"ContainerStarted","Data":"180c28cf3ad6cfc1e7e9063f603254f6f077bc36141b32b81687fd778e4a0592"} Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.457666 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"789dd8b9-2530-46a7-b5ee-7276afa689fb","Type":"ContainerStarted","Data":"444147c9de492ab59d7769eeeb59064d2ec1b373e1a1d9aae20915b1446dfed6"} Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.488096 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.488058174 podStartE2EDuration="2.488058174s" podCreationTimestamp="2025-12-06 15:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:29.475645519 +0000 UTC m=+1356.773656452" watchObservedRunningTime="2025-12-06 15:51:29.488058174 +0000 UTC m=+1356.786069087" Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.592023 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.592732 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.593180 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 15:51:29 crc kubenswrapper[4848]: I1206 15:51:29.599914 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.466222 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.470170 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.615931 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-qpndw"] Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.618502 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.638048 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-qpndw"] Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.738815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.739166 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.739195 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.739356 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-config\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.739595 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.739734 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsl8n\" (UniqueName: \"kubernetes.io/projected/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-kube-api-access-vsl8n\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsl8n\" (UniqueName: \"kubernetes.io/projected/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-kube-api-access-vsl8n\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841249 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841322 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841345 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841407 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-config\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.841498 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.842744 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.842801 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.842814 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-config\") pod 
\"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.842959 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.842999 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.872073 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsl8n\" (UniqueName: \"kubernetes.io/projected/0fbe788b-df4c-456d-a2d3-b64abbf62ac7-kube-api-access-vsl8n\") pod \"dnsmasq-dns-5c7b6c5df9-qpndw\" (UID: \"0fbe788b-df4c-456d-a2d3-b64abbf62ac7\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:30 crc kubenswrapper[4848]: I1206 15:51:30.969010 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:31 crc kubenswrapper[4848]: I1206 15:51:31.212065 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 06 15:51:31 crc kubenswrapper[4848]: I1206 15:51:31.446575 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-qpndw"] Dec 06 15:51:31 crc kubenswrapper[4848]: W1206 15:51:31.448816 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fbe788b_df4c_456d_a2d3_b64abbf62ac7.slice/crio-f904d4f50db9b31f97453e7029dd52155f09b3e928733316964244752105fb22 WatchSource:0}: Error finding container f904d4f50db9b31f97453e7029dd52155f09b3e928733316964244752105fb22: Status 404 returned error can't find the container with id f904d4f50db9b31f97453e7029dd52155f09b3e928733316964244752105fb22 Dec 06 15:51:31 crc kubenswrapper[4848]: I1206 15:51:31.475901 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" event={"ID":"0fbe788b-df4c-456d-a2d3-b64abbf62ac7","Type":"ContainerStarted","Data":"f904d4f50db9b31f97453e7029dd52155f09b3e928733316964244752105fb22"} Dec 06 15:51:32 crc kubenswrapper[4848]: I1206 15:51:32.487349 4848 generic.go:334] "Generic (PLEG): container finished" podID="0fbe788b-df4c-456d-a2d3-b64abbf62ac7" containerID="353a44e87497678d0390456e39bef803919bfd3b01df1ea41fed5358bcd04c51" exitCode=0 Dec 06 15:51:32 crc kubenswrapper[4848]: I1206 15:51:32.487460 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" event={"ID":"0fbe788b-df4c-456d-a2d3-b64abbf62ac7","Type":"ContainerDied","Data":"353a44e87497678d0390456e39bef803919bfd3b01df1ea41fed5358bcd04c51"} Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.119329 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.124192 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.124546 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-central-agent" containerID="cri-o://cd44e169e935af357be7bd7e466c1895af6e1e964b2324c0b3561f6fb9f07752" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.124666 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-notification-agent" containerID="cri-o://ff58a4bad97d41d6e96c0e1e04d88ad9cc86b5d7651f15322229e617add19ab7" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.124635 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="proxy-httpd" containerID="cri-o://175521c6e79a9a6c821230b4aa817ef9a9fefb01ae011617c544c9ee4752336d" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.124746 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="sg-core" containerID="cri-o://c759262a6455e528967b6b30f3b46a19641a2eff3df8e06403cb637038fbd9d2" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.131010 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.204:3000/\": EOF" Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.265226 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.499851 4848 generic.go:334] "Generic (PLEG): container finished" podID="5599ab9d-578e-4159-b876-e0d1ad467905" containerID="175521c6e79a9a6c821230b4aa817ef9a9fefb01ae011617c544c9ee4752336d" exitCode=0 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.499891 4848 generic.go:334] "Generic (PLEG): container finished" podID="5599ab9d-578e-4159-b876-e0d1ad467905" containerID="c759262a6455e528967b6b30f3b46a19641a2eff3df8e06403cb637038fbd9d2" exitCode=2 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.499943 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerDied","Data":"175521c6e79a9a6c821230b4aa817ef9a9fefb01ae011617c544c9ee4752336d"} Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.499993 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerDied","Data":"c759262a6455e528967b6b30f3b46a19641a2eff3df8e06403cb637038fbd9d2"} Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.502019 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-log" containerID="cri-o://d13c0676bddb16e11c0e1e2cfa6661d8152296db2fa6ab094e60a3d25107ed53" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.503190 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" event={"ID":"0fbe788b-df4c-456d-a2d3-b64abbf62ac7","Type":"ContainerStarted","Data":"2117cd5486d7ec9d3599baff9dbd2c522c9bc8fcb6a9be23a30c50b9dd62fdb5"} Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.503228 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:33 crc 
kubenswrapper[4848]: I1206 15:51:33.503605 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-api" containerID="cri-o://d826aca8307b94dd27c38288b71ba7b4e0a69a465d297665c0f60ab60cbdf4b1" gracePeriod=30 Dec 06 15:51:33 crc kubenswrapper[4848]: I1206 15:51:33.521758 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" podStartSLOduration=3.521737134 podStartE2EDuration="3.521737134s" podCreationTimestamp="2025-12-06 15:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:33.519895273 +0000 UTC m=+1360.817906186" watchObservedRunningTime="2025-12-06 15:51:33.521737134 +0000 UTC m=+1360.819748057" Dec 06 15:51:34 crc kubenswrapper[4848]: I1206 15:51:34.512407 4848 generic.go:334] "Generic (PLEG): container finished" podID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerID="d13c0676bddb16e11c0e1e2cfa6661d8152296db2fa6ab094e60a3d25107ed53" exitCode=143 Dec 06 15:51:34 crc kubenswrapper[4848]: I1206 15:51:34.512477 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerDied","Data":"d13c0676bddb16e11c0e1e2cfa6661d8152296db2fa6ab094e60a3d25107ed53"} Dec 06 15:51:34 crc kubenswrapper[4848]: I1206 15:51:34.515176 4848 generic.go:334] "Generic (PLEG): container finished" podID="5599ab9d-578e-4159-b876-e0d1ad467905" containerID="cd44e169e935af357be7bd7e466c1895af6e1e964b2324c0b3561f6fb9f07752" exitCode=0 Dec 06 15:51:34 crc kubenswrapper[4848]: I1206 15:51:34.515213 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerDied","Data":"cd44e169e935af357be7bd7e466c1895af6e1e964b2324c0b3561f6fb9f07752"} Dec 
06 15:51:36 crc kubenswrapper[4848]: I1206 15:51:36.534518 4848 generic.go:334] "Generic (PLEG): container finished" podID="5599ab9d-578e-4159-b876-e0d1ad467905" containerID="ff58a4bad97d41d6e96c0e1e04d88ad9cc86b5d7651f15322229e617add19ab7" exitCode=0 Dec 06 15:51:36 crc kubenswrapper[4848]: I1206 15:51:36.534589 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerDied","Data":"ff58a4bad97d41d6e96c0e1e04d88ad9cc86b5d7651f15322229e617add19ab7"} Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.280082 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386594 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386630 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386874 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.386987 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.387057 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.387180 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnbdn\" (UniqueName: \"kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn\") pod \"5599ab9d-578e-4159-b876-e0d1ad467905\" (UID: \"5599ab9d-578e-4159-b876-e0d1ad467905\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.387930 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.388073 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.388532 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.388655 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5599ab9d-578e-4159-b876-e0d1ad467905-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.396522 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts" (OuterVolumeSpecName: "scripts") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.399430 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn" (OuterVolumeSpecName: "kube-api-access-xnbdn") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "kube-api-access-xnbdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.452728 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.463543 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.487441 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.490299 4848 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.490335 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnbdn\" (UniqueName: \"kubernetes.io/projected/5599ab9d-578e-4159-b876-e0d1ad467905-kube-api-access-xnbdn\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.490346 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.490354 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.490362 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.510429 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data" (OuterVolumeSpecName: "config-data") pod "5599ab9d-578e-4159-b876-e0d1ad467905" (UID: "5599ab9d-578e-4159-b876-e0d1ad467905"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.547189 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5599ab9d-578e-4159-b876-e0d1ad467905","Type":"ContainerDied","Data":"612d3c7d4f011191e51b3f8809909d16c0617520eaa67692e4a7b87d479a1f12"} Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.548137 4848 scope.go:117] "RemoveContainer" containerID="175521c6e79a9a6c821230b4aa817ef9a9fefb01ae011617c544c9ee4752336d" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.547522 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.553349 4848 generic.go:334] "Generic (PLEG): container finished" podID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerID="d826aca8307b94dd27c38288b71ba7b4e0a69a465d297665c0f60ab60cbdf4b1" exitCode=0 Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.553444 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerDied","Data":"d826aca8307b94dd27c38288b71ba7b4e0a69a465d297665c0f60ab60cbdf4b1"} Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.553785 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fcc53fe-8f16-4991-9f98-83ee00d20244","Type":"ContainerDied","Data":"8775b7de7776a9db2e90e11b19f1943b5527e9fb3683b5ab1c7e465f0dee786a"} Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.553869 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8775b7de7776a9db2e90e11b19f1943b5527e9fb3683b5ab1c7e465f0dee786a" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.583382 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.598364 4848 scope.go:117] "RemoveContainer" containerID="c759262a6455e528967b6b30f3b46a19641a2eff3df8e06403cb637038fbd9d2" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.598375 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5599ab9d-578e-4159-b876-e0d1ad467905-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.601688 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.614512 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.630564 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631321 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-api" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631344 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-api" Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631361 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="proxy-httpd" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631369 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="proxy-httpd" Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631384 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-log" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631392 4848 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-log" Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631409 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="sg-core" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631417 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="sg-core" Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631444 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-notification-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631452 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-notification-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: E1206 15:51:37.631468 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-central-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631475 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-central-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631755 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-central-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631782 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="proxy-httpd" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631799 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="ceilometer-notification-agent" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 
15:51:37.631817 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" containerName="sg-core" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631834 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-api" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.631850 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" containerName="nova-api-log" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.636933 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.641722 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.641747 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.642400 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.650599 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.674134 4848 scope.go:117] "RemoveContainer" containerID="ff58a4bad97d41d6e96c0e1e04d88ad9cc86b5d7651f15322229e617add19ab7" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.699604 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkszh\" (UniqueName: \"kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh\") pod \"7fcc53fe-8f16-4991-9f98-83ee00d20244\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.699683 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data\") pod \"7fcc53fe-8f16-4991-9f98-83ee00d20244\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.699813 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle\") pod \"7fcc53fe-8f16-4991-9f98-83ee00d20244\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.699989 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs\") pod \"7fcc53fe-8f16-4991-9f98-83ee00d20244\" (UID: \"7fcc53fe-8f16-4991-9f98-83ee00d20244\") " Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700327 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-log-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700368 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-config-data\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700429 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700469 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqkq\" (UniqueName: \"kubernetes.io/projected/6a0084de-2a42-4cd7-a5ce-67c1770870b2-kube-api-access-bkqkq\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700530 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-scripts\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700587 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700664 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-run-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.700738 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc 
kubenswrapper[4848]: I1206 15:51:37.701722 4848 scope.go:117] "RemoveContainer" containerID="cd44e169e935af357be7bd7e466c1895af6e1e964b2324c0b3561f6fb9f07752" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.702419 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs" (OuterVolumeSpecName: "logs") pod "7fcc53fe-8f16-4991-9f98-83ee00d20244" (UID: "7fcc53fe-8f16-4991-9f98-83ee00d20244"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.704344 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh" (OuterVolumeSpecName: "kube-api-access-nkszh") pod "7fcc53fe-8f16-4991-9f98-83ee00d20244" (UID: "7fcc53fe-8f16-4991-9f98-83ee00d20244"). InnerVolumeSpecName "kube-api-access-nkszh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.743008 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data" (OuterVolumeSpecName: "config-data") pod "7fcc53fe-8f16-4991-9f98-83ee00d20244" (UID: "7fcc53fe-8f16-4991-9f98-83ee00d20244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.752411 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fcc53fe-8f16-4991-9f98-83ee00d20244" (UID: "7fcc53fe-8f16-4991-9f98-83ee00d20244"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.802763 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.802866 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-run-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.802928 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-log-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803032 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-config-data\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803103 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqkq\" (UniqueName: \"kubernetes.io/projected/6a0084de-2a42-4cd7-a5ce-67c1770870b2-kube-api-access-bkqkq\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803202 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-scripts\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803279 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkszh\" (UniqueName: \"kubernetes.io/projected/7fcc53fe-8f16-4991-9f98-83ee00d20244-kube-api-access-nkszh\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803316 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803326 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcc53fe-8f16-4991-9f98-83ee00d20244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.803335 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc53fe-8f16-4991-9f98-83ee00d20244-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:37 crc 
kubenswrapper[4848]: I1206 15:51:37.804094 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-log-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.804243 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a0084de-2a42-4cd7-a5ce-67c1770870b2-run-httpd\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.809567 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.810086 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-scripts\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.810812 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.811093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.812625 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0084de-2a42-4cd7-a5ce-67c1770870b2-config-data\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.821437 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqkq\" (UniqueName: \"kubernetes.io/projected/6a0084de-2a42-4cd7-a5ce-67c1770870b2-kube-api-access-bkqkq\") pod \"ceilometer-0\" (UID: \"6a0084de-2a42-4cd7-a5ce-67c1770870b2\") " pod="openstack/ceilometer-0" Dec 06 15:51:37 crc kubenswrapper[4848]: I1206 15:51:37.969561 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.119057 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.140388 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.449778 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 15:51:38 crc kubenswrapper[4848]: W1206 15:51:38.456070 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a0084de_2a42_4cd7_a5ce_67c1770870b2.slice/crio-730715c082ec352ff77b2925cf25f3daa05889589c7bf6431d04e34236ce82ba WatchSource:0}: Error finding container 730715c082ec352ff77b2925cf25f3daa05889589c7bf6431d04e34236ce82ba: Status 404 returned error can't find the container with id 730715c082ec352ff77b2925cf25f3daa05889589c7bf6431d04e34236ce82ba Dec 06 
15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.562740 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a0084de-2a42-4cd7-a5ce-67c1770870b2","Type":"ContainerStarted","Data":"730715c082ec352ff77b2925cf25f3daa05889589c7bf6431d04e34236ce82ba"} Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.563953 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.582319 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.600256 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.613535 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.652969 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.661327 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.664252 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.664928 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.665093 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.676156 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722766 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722848 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722883 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722913 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722947 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c55fc\" (UniqueName: \"kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.722972 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.764442 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-n4hw7"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.766161 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.769323 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.769521 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.775988 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-n4hw7"] Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824710 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824782 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824837 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: 
\"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824929 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c55fc\" (UniqueName: \"kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.824967 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.825038 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgrkx\" (UniqueName: \"kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.825059 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.825086 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 
06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.825150 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.825245 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.830907 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.831322 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.832970 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.833263 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.848344 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c55fc\" (UniqueName: \"kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc\") pod \"nova-api-0\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " pod="openstack/nova-api-0" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.930026 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgrkx\" (UniqueName: \"kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.930406 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.930526 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.930596 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 
15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.934670 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.937612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.937822 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.954097 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgrkx\" (UniqueName: \"kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx\") pod \"nova-cell1-cell-mapping-n4hw7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.980583 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5599ab9d-578e-4159-b876-e0d1ad467905" path="/var/lib/kubelet/pods/5599ab9d-578e-4159-b876-e0d1ad467905/volumes" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.981451 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcc53fe-8f16-4991-9f98-83ee00d20244" 
path="/var/lib/kubelet/pods/7fcc53fe-8f16-4991-9f98-83ee00d20244/volumes" Dec 06 15:51:38 crc kubenswrapper[4848]: I1206 15:51:38.992039 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:39 crc kubenswrapper[4848]: I1206 15:51:39.085290 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:39 crc kubenswrapper[4848]: I1206 15:51:39.442122 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:39 crc kubenswrapper[4848]: W1206 15:51:39.449208 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b9eb3e0_070e_4129_804a_6a7e68fb0f66.slice/crio-b030a3930a257a44a62b3b39b02054e17b397cdd0725c8d17879836d30bdc09a WatchSource:0}: Error finding container b030a3930a257a44a62b3b39b02054e17b397cdd0725c8d17879836d30bdc09a: Status 404 returned error can't find the container with id b030a3930a257a44a62b3b39b02054e17b397cdd0725c8d17879836d30bdc09a Dec 06 15:51:39 crc kubenswrapper[4848]: I1206 15:51:39.603989 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a0084de-2a42-4cd7-a5ce-67c1770870b2","Type":"ContainerStarted","Data":"0659062d1d135984dfa016e857abc33761d1f50a73102bd034acdad13d6fc954"} Dec 06 15:51:39 crc kubenswrapper[4848]: I1206 15:51:39.608012 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerStarted","Data":"b030a3930a257a44a62b3b39b02054e17b397cdd0725c8d17879836d30bdc09a"} Dec 06 15:51:39 crc kubenswrapper[4848]: W1206 15:51:39.634268 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc999bbb5_2904_48f5_bfa0_48a0ce1692d7.slice/crio-e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5 WatchSource:0}: Error finding container e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5: Status 404 returned error can't find the container with id e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5 Dec 06 15:51:39 crc kubenswrapper[4848]: I1206 15:51:39.636195 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-n4hw7"] Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.620729 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a0084de-2a42-4cd7-a5ce-67c1770870b2","Type":"ContainerStarted","Data":"057c125a2a7112c63d76fcf6682234511c97d70858b2e5f08ee05d6d94709489"} Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.622203 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n4hw7" event={"ID":"c999bbb5-2904-48f5-bfa0-48a0ce1692d7","Type":"ContainerStarted","Data":"ff268bc4d2434693e8ae701076e75a12788362c1cd5f3b4d79696327851e1617"} Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.622244 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n4hw7" event={"ID":"c999bbb5-2904-48f5-bfa0-48a0ce1692d7","Type":"ContainerStarted","Data":"e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5"} Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.625787 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerStarted","Data":"4ad44a842e01abb588c8bf21a2d9915c7ccb28f2b9c34df98184c588bf8eaac7"} Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.625822 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerStarted","Data":"356534065a438483fa361d1867815aedca5b19a1fef8423d1f3b78a9474c5789"} Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.644471 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-n4hw7" podStartSLOduration=2.64444939 podStartE2EDuration="2.64444939s" podCreationTimestamp="2025-12-06 15:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:40.639740512 +0000 UTC m=+1367.937751435" watchObservedRunningTime="2025-12-06 15:51:40.64444939 +0000 UTC m=+1367.942460303" Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.677076 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6770559990000002 podStartE2EDuration="2.677055999s" podCreationTimestamp="2025-12-06 15:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:40.665608911 +0000 UTC m=+1367.963619824" watchObservedRunningTime="2025-12-06 15:51:40.677055999 +0000 UTC m=+1367.975066912" Dec 06 15:51:40 crc kubenswrapper[4848]: I1206 15:51:40.976854 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-qpndw" Dec 06 15:51:41 crc kubenswrapper[4848]: I1206 15:51:41.062642 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:51:41 crc kubenswrapper[4848]: I1206 15:51:41.062919 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="dnsmasq-dns" containerID="cri-o://6c91bc25ae68e9b53e31425455d64d2116673f043295ae1291a349325e3df008" gracePeriod=10 Dec 06 15:51:41 
crc kubenswrapper[4848]: I1206 15:51:41.375058 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" Dec 06 15:51:41 crc kubenswrapper[4848]: I1206 15:51:41.639293 4848 generic.go:334] "Generic (PLEG): container finished" podID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerID="6c91bc25ae68e9b53e31425455d64d2116673f043295ae1291a349325e3df008" exitCode=0 Dec 06 15:51:41 crc kubenswrapper[4848]: I1206 15:51:41.639357 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" event={"ID":"21ef0ec0-5ab7-4256-920f-da903c1e4548","Type":"ContainerDied","Data":"6c91bc25ae68e9b53e31425455d64d2116673f043295ae1291a349325e3df008"} Dec 06 15:51:41 crc kubenswrapper[4848]: I1206 15:51:41.642255 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a0084de-2a42-4cd7-a5ce-67c1770870b2","Type":"ContainerStarted","Data":"aefcef6b9ba3cdf0d01dd6e440eacfdad2967aa23faad273728893fec3b8259a"} Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.251055 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.444259 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.445040 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.445258 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4nbf\" (UniqueName: \"kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.445468 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.445892 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.446050 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb\") pod \"21ef0ec0-5ab7-4256-920f-da903c1e4548\" (UID: \"21ef0ec0-5ab7-4256-920f-da903c1e4548\") " Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.451622 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf" (OuterVolumeSpecName: "kube-api-access-t4nbf") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "kube-api-access-t4nbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.508545 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.516202 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.521622 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config" (OuterVolumeSpecName: "config") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.529012 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.533180 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21ef0ec0-5ab7-4256-920f-da903c1e4548" (UID: "21ef0ec0-5ab7-4256-920f-da903c1e4548"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548410 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548450 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548463 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-config\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548476 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4nbf\" (UniqueName: \"kubernetes.io/projected/21ef0ec0-5ab7-4256-920f-da903c1e4548-kube-api-access-t4nbf\") on node \"crc\" DevicePath \"\"" Dec 06 
15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548490 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.548500 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ef0ec0-5ab7-4256-920f-da903c1e4548-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.652777 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" event={"ID":"21ef0ec0-5ab7-4256-920f-da903c1e4548","Type":"ContainerDied","Data":"560c6010daac26c16221018cd8feeea3067d40b90c8e83aa06588a7b6e5243b6"} Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.652867 4848 scope.go:117] "RemoveContainer" containerID="6c91bc25ae68e9b53e31425455d64d2116673f043295ae1291a349325e3df008" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.652808 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-mjqzp" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.658741 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a0084de-2a42-4cd7-a5ce-67c1770870b2","Type":"ContainerStarted","Data":"7f015f6344d8379e189b1234aec395a2ab44f6f783380ab1d29f3861eb5040ad"} Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.659676 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.680305 4848 scope.go:117] "RemoveContainer" containerID="4b5fe1ac7296c94ca393ab01fad2a2276f2192c9a4e7077ce20e288add87c128" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.690094 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.135494726 podStartE2EDuration="5.690070821s" podCreationTimestamp="2025-12-06 15:51:37 +0000 UTC" firstStartedPulling="2025-12-06 15:51:38.459402328 +0000 UTC m=+1365.757413241" lastFinishedPulling="2025-12-06 15:51:42.013978423 +0000 UTC m=+1369.311989336" observedRunningTime="2025-12-06 15:51:42.68557662 +0000 UTC m=+1369.983587553" watchObservedRunningTime="2025-12-06 15:51:42.690070821 +0000 UTC m=+1369.988081744" Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.723804 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.732817 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-mjqzp"] Dec 06 15:51:42 crc kubenswrapper[4848]: I1206 15:51:42.976502 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" path="/var/lib/kubelet/pods/21ef0ec0-5ab7-4256-920f-da903c1e4548/volumes" Dec 06 15:51:45 crc kubenswrapper[4848]: I1206 15:51:45.687964 4848 generic.go:334] "Generic (PLEG): 
container finished" podID="c999bbb5-2904-48f5-bfa0-48a0ce1692d7" containerID="ff268bc4d2434693e8ae701076e75a12788362c1cd5f3b4d79696327851e1617" exitCode=0 Dec 06 15:51:45 crc kubenswrapper[4848]: I1206 15:51:45.688273 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n4hw7" event={"ID":"c999bbb5-2904-48f5-bfa0-48a0ce1692d7","Type":"ContainerDied","Data":"ff268bc4d2434693e8ae701076e75a12788362c1cd5f3b4d79696327851e1617"} Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.165887 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.352418 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts\") pod \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.352571 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data\") pod \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.352626 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle\") pod \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.352672 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgrkx\" (UniqueName: \"kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx\") pod 
\"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\" (UID: \"c999bbb5-2904-48f5-bfa0-48a0ce1692d7\") " Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.357893 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts" (OuterVolumeSpecName: "scripts") pod "c999bbb5-2904-48f5-bfa0-48a0ce1692d7" (UID: "c999bbb5-2904-48f5-bfa0-48a0ce1692d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.358127 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx" (OuterVolumeSpecName: "kube-api-access-mgrkx") pod "c999bbb5-2904-48f5-bfa0-48a0ce1692d7" (UID: "c999bbb5-2904-48f5-bfa0-48a0ce1692d7"). InnerVolumeSpecName "kube-api-access-mgrkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.389303 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c999bbb5-2904-48f5-bfa0-48a0ce1692d7" (UID: "c999bbb5-2904-48f5-bfa0-48a0ce1692d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.389362 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data" (OuterVolumeSpecName: "config-data") pod "c999bbb5-2904-48f5-bfa0-48a0ce1692d7" (UID: "c999bbb5-2904-48f5-bfa0-48a0ce1692d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.455060 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.455099 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.455111 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.455120 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgrkx\" (UniqueName: \"kubernetes.io/projected/c999bbb5-2904-48f5-bfa0-48a0ce1692d7-kube-api-access-mgrkx\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.707651 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n4hw7" event={"ID":"c999bbb5-2904-48f5-bfa0-48a0ce1692d7","Type":"ContainerDied","Data":"e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5"} Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.708168 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3bd4dfa33d5834fb702aea8b8998c10465cf88e86d2380cfb18c4d55b47eea5" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.707706 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n4hw7" Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.886944 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.887287 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" containerName="nova-scheduler-scheduler" containerID="cri-o://49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" gracePeriod=30 Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.897438 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.897668 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-log" containerID="cri-o://356534065a438483fa361d1867815aedca5b19a1fef8423d1f3b78a9474c5789" gracePeriod=30 Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.898074 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-api" containerID="cri-o://4ad44a842e01abb588c8bf21a2d9915c7ccb28f2b9c34df98184c588bf8eaac7" gracePeriod=30 Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.920900 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.921136 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" containerID="cri-o://05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119" gracePeriod=30 Dec 06 15:51:47 crc kubenswrapper[4848]: I1206 15:51:47.921933 4848 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" containerID="cri-o://a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd" gracePeriod=30 Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.720804 4848 generic.go:334] "Generic (PLEG): container finished" podID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerID="4ad44a842e01abb588c8bf21a2d9915c7ccb28f2b9c34df98184c588bf8eaac7" exitCode=0 Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.721079 4848 generic.go:334] "Generic (PLEG): container finished" podID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerID="356534065a438483fa361d1867815aedca5b19a1fef8423d1f3b78a9474c5789" exitCode=143 Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.720900 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerDied","Data":"4ad44a842e01abb588c8bf21a2d9915c7ccb28f2b9c34df98184c588bf8eaac7"} Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.721173 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerDied","Data":"356534065a438483fa361d1867815aedca5b19a1fef8423d1f3b78a9474c5789"} Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.723811 4848 generic.go:334] "Generic (PLEG): container finished" podID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerID="05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119" exitCode=143 Dec 06 15:51:48 crc kubenswrapper[4848]: I1206 15:51:48.723850 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerDied","Data":"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119"} Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 
15:51:49.043444 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085121 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085408 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c55fc\" (UniqueName: \"kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085447 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085474 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085524 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.085595 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle\") pod \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\" (UID: \"1b9eb3e0-070e-4129-804a-6a7e68fb0f66\") " Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.086283 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs" (OuterVolumeSpecName: "logs") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.096994 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc" (OuterVolumeSpecName: "kube-api-access-c55fc") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "kube-api-access-c55fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.123865 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data" (OuterVolumeSpecName: "config-data") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.135833 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.147852 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.160923 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b9eb3e0-070e-4129-804a-6a7e68fb0f66" (UID: "1b9eb3e0-070e-4129-804a-6a7e68fb0f66"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187016 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c55fc\" (UniqueName: \"kubernetes.io/projected/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-kube-api-access-c55fc\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187065 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187079 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187087 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-public-tls-certs\") on 
node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187095 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.187103 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b9eb3e0-070e-4129-804a-6a7e68fb0f66-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.734047 4848 generic.go:334] "Generic (PLEG): container finished" podID="fb793c37-98f3-42b2-be40-3778672cb7d6" containerID="49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" exitCode=0 Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.734138 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb793c37-98f3-42b2-be40-3778672cb7d6","Type":"ContainerDied","Data":"49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378"} Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.737133 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b9eb3e0-070e-4129-804a-6a7e68fb0f66","Type":"ContainerDied","Data":"b030a3930a257a44a62b3b39b02054e17b397cdd0725c8d17879836d30bdc09a"} Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.737176 4848 scope.go:117] "RemoveContainer" containerID="4ad44a842e01abb588c8bf21a2d9915c7ccb28f2b9c34df98184c588bf8eaac7" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.737178 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.772080 4848 scope.go:117] "RemoveContainer" containerID="356534065a438483fa361d1867815aedca5b19a1fef8423d1f3b78a9474c5789" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.794650 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.807779 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.821672 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:49 crc kubenswrapper[4848]: E1206 15:51:49.823141 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-api" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823164 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-api" Dec 06 15:51:49 crc kubenswrapper[4848]: E1206 15:51:49.823187 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="init" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823198 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="init" Dec 06 15:51:49 crc kubenswrapper[4848]: E1206 15:51:49.823232 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-log" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823240 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-log" Dec 06 15:51:49 crc kubenswrapper[4848]: E1206 15:51:49.823252 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" 
containerName="dnsmasq-dns" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823259 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="dnsmasq-dns" Dec 06 15:51:49 crc kubenswrapper[4848]: E1206 15:51:49.823282 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c999bbb5-2904-48f5-bfa0-48a0ce1692d7" containerName="nova-manage" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823291 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c999bbb5-2904-48f5-bfa0-48a0ce1692d7" containerName="nova-manage" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823514 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-log" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823533 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c999bbb5-2904-48f5-bfa0-48a0ce1692d7" containerName="nova-manage" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823552 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ef0ec0-5ab7-4256-920f-da903c1e4548" containerName="dnsmasq-dns" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.823564 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" containerName="nova-api-api" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.824813 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.832036 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.832725 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.833141 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 15:51:49 crc kubenswrapper[4848]: I1206 15:51:49.842266 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.002477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.003445 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rhz7\" (UniqueName: \"kubernetes.io/projected/310ce79c-5eaa-461b-b99c-9e4aee7849c4-kube-api-access-2rhz7\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.003610 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.003641 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-config-data\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.003672 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ce79c-5eaa-461b-b99c-9e4aee7849c4-logs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.003950 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.106722 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.106804 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.106841 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rhz7\" (UniqueName: \"kubernetes.io/projected/310ce79c-5eaa-461b-b99c-9e4aee7849c4-kube-api-access-2rhz7\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " 
pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.107172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.107233 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-config-data\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.107292 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ce79c-5eaa-461b-b99c-9e4aee7849c4-logs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.109362 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ce79c-5eaa-461b-b99c-9e4aee7849c4-logs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.112240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-config-data\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.112915 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.114110 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.116562 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310ce79c-5eaa-461b-b99c-9e4aee7849c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.129441 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rhz7\" (UniqueName: \"kubernetes.io/projected/310ce79c-5eaa-461b-b99c-9e4aee7849c4-kube-api-access-2rhz7\") pod \"nova-api-0\" (UID: \"310ce79c-5eaa-461b-b99c-9e4aee7849c4\") " pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.144429 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 15:51:50 crc kubenswrapper[4848]: E1206 15:51:50.160977 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378 is running failed: container process not found" containerID="49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:50 crc kubenswrapper[4848]: E1206 15:51:50.161596 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378 is running failed: container process not found" containerID="49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:50 crc kubenswrapper[4848]: E1206 15:51:50.161874 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378 is running failed: container process not found" containerID="49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 15:51:50 crc kubenswrapper[4848]: E1206 15:51:50.161910 4848 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" containerName="nova-scheduler-scheduler" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.265966 4848 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.416646 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m5kz\" (UniqueName: \"kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz\") pod \"fb793c37-98f3-42b2-be40-3778672cb7d6\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.418093 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle\") pod \"fb793c37-98f3-42b2-be40-3778672cb7d6\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.418366 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data\") pod \"fb793c37-98f3-42b2-be40-3778672cb7d6\" (UID: \"fb793c37-98f3-42b2-be40-3778672cb7d6\") " Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.422176 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz" (OuterVolumeSpecName: "kube-api-access-5m5kz") pod "fb793c37-98f3-42b2-be40-3778672cb7d6" (UID: "fb793c37-98f3-42b2-be40-3778672cb7d6"). InnerVolumeSpecName "kube-api-access-5m5kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.449920 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb793c37-98f3-42b2-be40-3778672cb7d6" (UID: "fb793c37-98f3-42b2-be40-3778672cb7d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.485926 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data" (OuterVolumeSpecName: "config-data") pod "fb793c37-98f3-42b2-be40-3778672cb7d6" (UID: "fb793c37-98f3-42b2-be40-3778672cb7d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.525819 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.525855 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb793c37-98f3-42b2-be40-3778672cb7d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.525868 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m5kz\" (UniqueName: \"kubernetes.io/projected/fb793c37-98f3-42b2-be40-3778672cb7d6-kube-api-access-5m5kz\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.698537 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: W1206 15:51:50.701195 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310ce79c_5eaa_461b_b99c_9e4aee7849c4.slice/crio-dc50c4027c3f2b45a9b21b0613181008ca637299ec13fce444d4dfe7eba72454 WatchSource:0}: Error finding container dc50c4027c3f2b45a9b21b0613181008ca637299ec13fce444d4dfe7eba72454: Status 404 returned error can't find the container with id dc50c4027c3f2b45a9b21b0613181008ca637299ec13fce444d4dfe7eba72454 Dec 06 15:51:50 
crc kubenswrapper[4848]: I1206 15:51:50.751995 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.752023 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb793c37-98f3-42b2-be40-3778672cb7d6","Type":"ContainerDied","Data":"ee19a670ee9ba6055042c69f5209b91792d579f8c982b6342f451580c1872ea1"} Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.752068 4848 scope.go:117] "RemoveContainer" containerID="49b89022fbeed5497e4086fc57b491ded73b202e176ab4deef560bc36ea6a378" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.757485 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"310ce79c-5eaa-461b-b99c-9e4aee7849c4","Type":"ContainerStarted","Data":"dc50c4027c3f2b45a9b21b0613181008ca637299ec13fce444d4dfe7eba72454"} Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.791099 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.808435 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.817991 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: E1206 15:51:50.818485 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" containerName="nova-scheduler-scheduler" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.818507 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" containerName="nova-scheduler-scheduler" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.818743 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" 
containerName="nova-scheduler-scheduler" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.819486 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.821153 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.827569 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.838822 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-config-data\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.838946 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxsdl\" (UniqueName: \"kubernetes.io/projected/4d6e9497-4228-4848-962e-e319a0c0fdf4-kube-api-access-nxsdl\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.838993 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.941027 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-config-data\") pod \"nova-scheduler-0\" (UID: 
\"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.941147 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxsdl\" (UniqueName: \"kubernetes.io/projected/4d6e9497-4228-4848-962e-e319a0c0fdf4-kube-api-access-nxsdl\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.941489 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.945436 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.945508 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6e9497-4228-4848-962e-e319a0c0fdf4-config-data\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.962649 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxsdl\" (UniqueName: \"kubernetes.io/projected/4d6e9497-4228-4848-962e-e319a0c0fdf4-kube-api-access-nxsdl\") pod \"nova-scheduler-0\" (UID: \"4d6e9497-4228-4848-962e-e319a0c0fdf4\") " pod="openstack/nova-scheduler-0" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.980755 4848 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1b9eb3e0-070e-4129-804a-6a7e68fb0f66" path="/var/lib/kubelet/pods/1b9eb3e0-070e-4129-804a-6a7e68fb0f66/volumes" Dec 06 15:51:50 crc kubenswrapper[4848]: I1206 15:51:50.981801 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb793c37-98f3-42b2-be40-3778672cb7d6" path="/var/lib/kubelet/pods/fb793c37-98f3-42b2-be40-3778672cb7d6/volumes" Dec 06 15:51:51 crc kubenswrapper[4848]: I1206 15:51:51.051000 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:45754->10.217.0.199:8775: read: connection reset by peer" Dec 06 15:51:51 crc kubenswrapper[4848]: I1206 15:51:51.051015 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:45756->10.217.0.199:8775: read: connection reset by peer" Dec 06 15:51:51 crc kubenswrapper[4848]: I1206 15:51:51.147584 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 15:51:51 crc kubenswrapper[4848]: I1206 15:51:51.621397 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 15:51:51 crc kubenswrapper[4848]: I1206 15:51:51.770946 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d6e9497-4228-4848-962e-e319a0c0fdf4","Type":"ContainerStarted","Data":"21b29f9f7f5271f1fe0408fa29754ada6a3aaafa8b89de912cb7418ee2a7919c"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.731863 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.785191 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"310ce79c-5eaa-461b-b99c-9e4aee7849c4","Type":"ContainerStarted","Data":"ee98d6a7506b64c359e9e15f2822eb59c8fc34f388b3d0c2ade485e2c62f6bec"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.785238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"310ce79c-5eaa-461b-b99c-9e4aee7849c4","Type":"ContainerStarted","Data":"10d623768767fdde777e860df0b42aaac1b719c142c50ebd2c910d4ed0200978"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.793007 4848 generic.go:334] "Generic (PLEG): container finished" podID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerID="a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd" exitCode=0 Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.793082 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerDied","Data":"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.793116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc","Type":"ContainerDied","Data":"d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.793137 4848 scope.go:117] "RemoveContainer" containerID="a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.793324 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.796269 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d6e9497-4228-4848-962e-e319a0c0fdf4","Type":"ContainerStarted","Data":"fa644244c47046fbdf7f61f51faa78e5a8b87139991eaad2ece1d6b96a87336d"} Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.826934 4848 scope.go:117] "RemoveContainer" containerID="05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.829209 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.829189465 podStartE2EDuration="3.829189465s" podCreationTimestamp="2025-12-06 15:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:52.799869084 +0000 UTC m=+1380.097879997" watchObservedRunningTime="2025-12-06 15:51:52.829189465 +0000 UTC m=+1380.127200378" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.835575 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.835557597 podStartE2EDuration="2.835557597s" podCreationTimestamp="2025-12-06 15:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:52.819862434 +0000 UTC m=+1380.117873347" watchObservedRunningTime="2025-12-06 15:51:52.835557597 +0000 UTC m=+1380.133568510" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.855333 4848 scope.go:117] "RemoveContainer" containerID="a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd" Dec 06 15:51:52 crc kubenswrapper[4848]: E1206 15:51:52.855980 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd\": container with ID starting with a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd not found: ID does not exist" containerID="a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.856123 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd"} err="failed to get container status \"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd\": rpc error: code = NotFound desc = could not find container \"a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd\": container with ID starting with a053d36f690035e270131d0fdfabab5a68635ae49e4a7a5f17e54cc19de42fbd not found: ID does not exist" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.856208 4848 scope.go:117] "RemoveContainer" containerID="05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119" Dec 06 15:51:52 crc kubenswrapper[4848]: E1206 15:51:52.856719 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119\": container with ID starting with 05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119 not found: ID does not exist" containerID="05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.856774 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119"} err="failed to get container status \"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119\": rpc error: code = NotFound desc = could not find container \"05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119\": 
container with ID starting with 05c78c64782628eb77cece55877b24f0cd2d47be37ca276bb866100853f71119 not found: ID does not exist" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.876743 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq4nv\" (UniqueName: \"kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv\") pod \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.876865 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs\") pod \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.876922 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data\") pod \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.876981 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs\") pod \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.877091 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle\") pod \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\" (UID: \"b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc\") " Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.878185 4848 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs" (OuterVolumeSpecName: "logs") pod "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" (UID: "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.898440 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv" (OuterVolumeSpecName: "kube-api-access-lq4nv") pod "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" (UID: "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc"). InnerVolumeSpecName "kube-api-access-lq4nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.907587 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data" (OuterVolumeSpecName: "config-data") pod "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" (UID: "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.932276 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" (UID: "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.935513 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" (UID: "b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.979852 4848 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.979891 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.979904 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-logs\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.979916 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:52 crc kubenswrapper[4848]: I1206 15:51:52.979928 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq4nv\" (UniqueName: \"kubernetes.io/projected/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc-kube-api-access-lq4nv\") on node \"crc\" DevicePath \"\"" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.130274 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.142978 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.153513 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:53 crc kubenswrapper[4848]: E1206 15:51:53.154218 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.154329 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" Dec 06 15:51:53 crc kubenswrapper[4848]: E1206 15:51:53.154454 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.154538 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.154933 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-log" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.155074 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" containerName="nova-metadata-metadata" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.156286 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.159087 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.159463 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.162728 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.284894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-config-data\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.284957 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.285046 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv4d6\" (UniqueName: \"kubernetes.io/projected/daebc9f0-bb2e-4deb-aa48-00553c450e81-kube-api-access-mv4d6\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.285088 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebc9f0-bb2e-4deb-aa48-00553c450e81-logs\") pod \"nova-metadata-0\" (UID: 
\"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.285127 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.387440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv4d6\" (UniqueName: \"kubernetes.io/projected/daebc9f0-bb2e-4deb-aa48-00553c450e81-kube-api-access-mv4d6\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.387544 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebc9f0-bb2e-4deb-aa48-00553c450e81-logs\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.387580 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.387727 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-config-data\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.387782 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.388365 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebc9f0-bb2e-4deb-aa48-00553c450e81-logs\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.392130 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.392239 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-config-data\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.392241 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daebc9f0-bb2e-4deb-aa48-00553c450e81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.408842 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv4d6\" (UniqueName: \"kubernetes.io/projected/daebc9f0-bb2e-4deb-aa48-00553c450e81-kube-api-access-mv4d6\") pod \"nova-metadata-0\" 
(UID: \"daebc9f0-bb2e-4deb-aa48-00553c450e81\") " pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.475636 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 15:51:53 crc kubenswrapper[4848]: I1206 15:51:53.953948 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 15:51:54 crc kubenswrapper[4848]: I1206 15:51:54.824647 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebc9f0-bb2e-4deb-aa48-00553c450e81","Type":"ContainerStarted","Data":"6b436c00ef6601c74e48d032f7b82f09dfcde9495d60b9eab5c6fa132716312a"} Dec 06 15:51:54 crc kubenswrapper[4848]: I1206 15:51:54.825243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebc9f0-bb2e-4deb-aa48-00553c450e81","Type":"ContainerStarted","Data":"8b3e01cfe5ab19e9f1f707fb20945b915a970c84ddf34d8c5d17a1e973ddc7eb"} Dec 06 15:51:54 crc kubenswrapper[4848]: I1206 15:51:54.825258 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebc9f0-bb2e-4deb-aa48-00553c450e81","Type":"ContainerStarted","Data":"b8f426b09f53e86ec1f04f0b1ac9020b9b0eb295ae4f51a7b2bec9cf0e4fd584"} Dec 06 15:51:54 crc kubenswrapper[4848]: I1206 15:51:54.850361 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.850340846 podStartE2EDuration="1.850340846s" podCreationTimestamp="2025-12-06 15:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:51:54.847830248 +0000 UTC m=+1382.145841161" watchObservedRunningTime="2025-12-06 15:51:54.850340846 +0000 UTC m=+1382.148351759" Dec 06 15:51:54 crc kubenswrapper[4848]: I1206 15:51:54.978227 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc" path="/var/lib/kubelet/pods/b7eb423f-2775-4ea0-8c6e-d7dc2542e1cc/volumes" Dec 06 15:51:56 crc kubenswrapper[4848]: E1206 15:51:56.036651 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 15:51:56 crc kubenswrapper[4848]: I1206 15:51:56.148516 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 15:51:58 crc kubenswrapper[4848]: I1206 15:51:58.476443 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 15:51:58 crc kubenswrapper[4848]: I1206 15:51:58.476808 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 15:52:00 crc kubenswrapper[4848]: I1206 15:52:00.145929 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:52:00 crc kubenswrapper[4848]: I1206 15:52:00.145987 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 15:52:01 crc kubenswrapper[4848]: I1206 15:52:01.148220 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 15:52:01 crc kubenswrapper[4848]: I1206 15:52:01.160882 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="310ce79c-5eaa-461b-b99c-9e4aee7849c4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 15:52:01 crc kubenswrapper[4848]: I1206 15:52:01.160919 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="310ce79c-5eaa-461b-b99c-9e4aee7849c4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 15:52:01 crc kubenswrapper[4848]: I1206 15:52:01.181944 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 15:52:01 crc kubenswrapper[4848]: I1206 15:52:01.931677 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 15:52:03 crc kubenswrapper[4848]: I1206 15:52:03.476761 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 15:52:03 crc kubenswrapper[4848]: I1206 15:52:03.477051 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 15:52:04 crc kubenswrapper[4848]: I1206 15:52:04.489833 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="daebc9f0-bb2e-4deb-aa48-00553c450e81" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 15:52:04 crc kubenswrapper[4848]: I1206 15:52:04.489865 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="daebc9f0-bb2e-4deb-aa48-00553c450e81" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 15:52:06 crc kubenswrapper[4848]: E1206 15:52:06.312551 4848 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data in memory cache]" Dec 06 15:52:07 crc kubenswrapper[4848]: I1206 15:52:07.977807 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 15:52:10 crc kubenswrapper[4848]: I1206 15:52:10.153899 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 15:52:10 crc kubenswrapper[4848]: I1206 15:52:10.154782 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 15:52:10 crc kubenswrapper[4848]: I1206 15:52:10.155584 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 15:52:10 crc kubenswrapper[4848]: I1206 15:52:10.161937 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 15:52:10 crc kubenswrapper[4848]: I1206 15:52:10.999004 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 15:52:11 crc kubenswrapper[4848]: I1206 15:52:11.006614 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 15:52:13 crc kubenswrapper[4848]: I1206 15:52:13.482122 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 15:52:13 crc kubenswrapper[4848]: I1206 15:52:13.482589 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 15:52:13 crc kubenswrapper[4848]: I1206 15:52:13.487563 4848 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 15:52:13 crc kubenswrapper[4848]: I1206 15:52:13.489405 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 15:52:16 crc kubenswrapper[4848]: E1206 15:52:16.572687 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 15:52:22 crc kubenswrapper[4848]: I1206 15:52:22.197085 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:23 crc kubenswrapper[4848]: I1206 15:52:23.001990 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:26 crc kubenswrapper[4848]: I1206 15:52:26.327318 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="rabbitmq" containerID="cri-o://35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085" gracePeriod=604796 Dec 06 15:52:26 crc kubenswrapper[4848]: I1206 15:52:26.748731 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="rabbitmq" containerID="cri-o://8413d8e7b39ea3e60a00900db0021fc2f01a010ecbc78475875e6f6ff9166990" gracePeriod=604797 Dec 06 15:52:26 crc kubenswrapper[4848]: E1206 15:52:26.830250 4848 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.019868 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083472 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ct5c\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083523 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083549 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083586 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls\") pod 
\"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083603 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083755 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083804 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083839 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083918 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.083937 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf\") pod \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\" (UID: \"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.084010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.084359 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.084616 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.084785 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.096349 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.098557 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.099245 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info" (OuterVolumeSpecName: "pod-info") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.103914 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.115147 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c" (OuterVolumeSpecName: "kube-api-access-9ct5c") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "kube-api-access-9ct5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.168049 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data" (OuterVolumeSpecName: "config-data") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186145 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ct5c\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-kube-api-access-9ct5c\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186176 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186187 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186195 4848 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186204 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186228 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186248 4848 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.186256 
4848 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.202956 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf" (OuterVolumeSpecName: "server-conf") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.217561 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.232951 4848 generic.go:334] "Generic (PLEG): container finished" podID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerID="8413d8e7b39ea3e60a00900db0021fc2f01a010ecbc78475875e6f6ff9166990" exitCode=0 Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.233089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerDied","Data":"8413d8e7b39ea3e60a00900db0021fc2f01a010ecbc78475875e6f6ff9166990"} Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.235540 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerID="35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085" exitCode=0 Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.235614 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerDied","Data":"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085"} Dec 06 15:52:33 
crc kubenswrapper[4848]: I1206 15:52:33.235632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75","Type":"ContainerDied","Data":"959b8132da20e7c73f4fe18f58777e716e0a81d28816984e7c4ed67dbf73bb93"} Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.235649 4848 scope.go:117] "RemoveContainer" containerID="35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.235853 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.265181 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.276825 4848 scope.go:117] "RemoveContainer" containerID="ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295414 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295498 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295525 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf\") pod 
\"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295581 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295603 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295634 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295654 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295723 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295787 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjd4k\" 
(UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295858 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.295874 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd\") pod \"dda76265-1c2c-4409-8460-99bc3ab509c6\" (UID: \"dda76265-1c2c-4409-8460-99bc3ab509c6\") " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.296235 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.296246 4848 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.314900 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.315162 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.315406 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.315987 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.357673 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info" (OuterVolumeSpecName: "pod-info") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.366901 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.393958 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k" (OuterVolumeSpecName: "kube-api-access-gjd4k") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "kube-api-access-gjd4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.394074 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.394199 4848 scope.go:117] "RemoveContainer" containerID="35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085" Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.394711 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085\": container with ID starting with 35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085 not found: ID does not exist" containerID="35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.394750 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085"} err="failed to get container status \"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085\": rpc error: code = NotFound desc = could not find container \"35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085\": container with ID starting with 35198894c737aed3d8c10cfca44e2f29c3f228f3f218c3b8f0a33ad5208c2085 not found: ID does not exist" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.394788 4848 scope.go:117] "RemoveContainer" containerID="ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade" Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.395135 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade\": container with ID starting with ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade not found: ID does not exist" containerID="ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.395177 
4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade"} err="failed to get container status \"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade\": rpc error: code = NotFound desc = could not find container \"ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade\": container with ID starting with ec1ab24f36f2af9fdaad736116e1d1f994c6e8d844c48c53b6453964c53b0ade not found: ID does not exist" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407592 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407627 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407638 4848 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407646 4848 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dda76265-1c2c-4409-8460-99bc3ab509c6-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407654 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407673 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407684 4848 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dda76265-1c2c-4409-8460-99bc3ab509c6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.407708 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjd4k\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-kube-api-access-gjd4k\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.408966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data" (OuterVolumeSpecName: "config-data") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.445284 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" (UID: "b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.465780 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.467215 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf" (OuterVolumeSpecName: "server-conf") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.509383 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.509409 4848 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.509421 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.509429 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dda76265-1c2c-4409-8460-99bc3ab509c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.532279 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd" (OuterVolumeSpecName: 
"rabbitmq-confd") pod "dda76265-1c2c-4409-8460-99bc3ab509c6" (UID: "dda76265-1c2c-4409-8460-99bc3ab509c6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.579839 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.600163 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.610570 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dda76265-1c2c-4409-8460-99bc3ab509c6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.616769 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.617159 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617174 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.617195 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617201 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.617212 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="setup-container" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617218 4848 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="setup-container" Dec 06 15:52:33 crc kubenswrapper[4848]: E1206 15:52:33.617231 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="setup-container" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617237 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="setup-container" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617426 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.617440 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="rabbitmq" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.618450 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624348 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624546 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624719 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624828 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624897 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.624965 4848 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vd4vp" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.625001 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.629252 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713599 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713676 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d58ead1c-d7f6-4643-a869-8566f5d9843b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713780 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713803 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713836 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713873 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d58ead1c-d7f6-4643-a869-8566f5d9843b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713890 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713940 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdl9g\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-kube-api-access-tdl9g\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.713967 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.714023 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815166 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdl9g\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-kube-api-access-tdl9g\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815265 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815310 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815334 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d58ead1c-d7f6-4643-a869-8566f5d9843b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815362 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815381 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815408 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815439 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d58ead1c-d7f6-4643-a869-8566f5d9843b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 
06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815459 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.815477 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.816382 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.816441 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.816864 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.817212 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.817320 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.818000 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d58ead1c-d7f6-4643-a869-8566f5d9843b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.820775 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d58ead1c-d7f6-4643-a869-8566f5d9843b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.820811 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.821431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc 
kubenswrapper[4848]: I1206 15:52:33.827485 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d58ead1c-d7f6-4643-a869-8566f5d9843b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.839263 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdl9g\" (UniqueName: \"kubernetes.io/projected/d58ead1c-d7f6-4643-a869-8566f5d9843b-kube-api-access-tdl9g\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.857828 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d58ead1c-d7f6-4643-a869-8566f5d9843b\") " pod="openstack/rabbitmq-server-0" Dec 06 15:52:33 crc kubenswrapper[4848]: I1206 15:52:33.977678 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.247176 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dda76265-1c2c-4409-8460-99bc3ab509c6","Type":"ContainerDied","Data":"8db94531ef80ba7234486ee2fa33fa133112c2b769013f0ce9cc6ede377335bb"} Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.247184 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.247664 4848 scope.go:117] "RemoveContainer" containerID="8413d8e7b39ea3e60a00900db0021fc2f01a010ecbc78475875e6f6ff9166990" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.294244 4848 scope.go:117] "RemoveContainer" containerID="2d8388c8c09ab32d2f3f332f8e780dac4da735bd62c80866b31cece9902797ce" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.312239 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.336209 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.349593 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.361290 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.365831 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.366242 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.366275 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.366498 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.366534 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.368443 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.369393 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.371791 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5pvs9" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.478588 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.546891 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 
crc kubenswrapper[4848]: I1206 15:52:34.547400 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547460 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547503 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547532 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547568 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 
15:52:34.547615 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547664 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547778 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.547914 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.548024 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64gnm\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-kube-api-access-64gnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc 
kubenswrapper[4848]: I1206 15:52:34.650145 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650235 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64gnm\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-kube-api-access-64gnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650288 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650326 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650351 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650402 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650422 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650444 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.650472 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.651110 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.652333 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.652674 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.652691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.652693 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.653510 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.658316 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.658530 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.659213 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.671617 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc 
kubenswrapper[4848]: I1206 15:52:34.676651 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64gnm\" (UniqueName: \"kubernetes.io/projected/0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7-kube-api-access-64gnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.693049 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.977599 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" path="/var/lib/kubelet/pods/b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75/volumes" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.978435 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda76265-1c2c-4409-8460-99bc3ab509c6" path="/var/lib/kubelet/pods/dda76265-1c2c-4409-8460-99bc3ab509c6/volumes" Dec 06 15:52:34 crc kubenswrapper[4848]: I1206 15:52:34.996236 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:52:35 crc kubenswrapper[4848]: I1206 15:52:35.263251 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d58ead1c-d7f6-4643-a869-8566f5d9843b","Type":"ContainerStarted","Data":"ae2ffe3e15bd0823fb93d1aec7e4b467002ac27511e3d5ddba063fd0dac99fe1"} Dec 06 15:52:35 crc kubenswrapper[4848]: I1206 15:52:35.300349 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 15:52:35 crc kubenswrapper[4848]: W1206 15:52:35.302748 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4e03ac_e69e_46de_9dd1_5ea4cc56c0c7.slice/crio-6bca2db5c3270c809c1b41d8d4684fc6aa87afdefe7ede6cfb600d065ecfb44a WatchSource:0}: Error finding container 6bca2db5c3270c809c1b41d8d4684fc6aa87afdefe7ede6cfb600d065ecfb44a: Status 404 returned error can't find the container with id 6bca2db5c3270c809c1b41d8d4684fc6aa87afdefe7ede6cfb600d065ecfb44a Dec 06 15:52:36 crc kubenswrapper[4848]: I1206 15:52:36.274488 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7","Type":"ContainerStarted","Data":"6bca2db5c3270c809c1b41d8d4684fc6aa87afdefe7ede6cfb600d065ecfb44a"} Dec 06 15:52:37 crc kubenswrapper[4848]: E1206 15:52:37.101652 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data in memory cache]" Dec 06 15:52:37 crc kubenswrapper[4848]: I1206 
15:52:37.285130 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d58ead1c-d7f6-4643-a869-8566f5d9843b","Type":"ContainerStarted","Data":"65567d8519983fcdc2c525e7f13886e8fffbe1a0f708f4f43ea32e37a5e9fe07"} Dec 06 15:52:37 crc kubenswrapper[4848]: I1206 15:52:37.286281 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7","Type":"ContainerStarted","Data":"a0a5a00f2dafa1043212144c478a255f2034d03a8686c7832bc78eeb2b8e96b9"} Dec 06 15:52:37 crc kubenswrapper[4848]: I1206 15:52:37.952055 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b5e9e88c-6b1c-44dc-b00f-f8e4a25c5a75" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: i/o timeout" Dec 06 15:52:47 crc kubenswrapper[4848]: I1206 15:52:47.149751 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:52:47 crc kubenswrapper[4848]: I1206 15:52:47.150253 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:52:47 crc kubenswrapper[4848]: E1206 15:52:47.367153 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice/crio-d5947b2ef4a7b9b8d80fbec5660243fa3ce2422d61650322fe2daa2aea0d6851\": RecentStats: unable to find data 
in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb423f_2775_4ea0_8c6e_d7dc2542e1cc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 15:53:08 crc kubenswrapper[4848]: I1206 15:53:08.613815 4848 generic.go:334] "Generic (PLEG): container finished" podID="d58ead1c-d7f6-4643-a869-8566f5d9843b" containerID="65567d8519983fcdc2c525e7f13886e8fffbe1a0f708f4f43ea32e37a5e9fe07" exitCode=0 Dec 06 15:53:08 crc kubenswrapper[4848]: I1206 15:53:08.613924 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d58ead1c-d7f6-4643-a869-8566f5d9843b","Type":"ContainerDied","Data":"65567d8519983fcdc2c525e7f13886e8fffbe1a0f708f4f43ea32e37a5e9fe07"} Dec 06 15:53:09 crc kubenswrapper[4848]: I1206 15:53:09.624796 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d58ead1c-d7f6-4643-a869-8566f5d9843b","Type":"ContainerStarted","Data":"c93074b2b7fda47e8f031bf80f393bf849dbf338cddad454db0739868f1e671f"} Dec 06 15:53:09 crc kubenswrapper[4848]: I1206 15:53:09.625623 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 15:53:09 crc kubenswrapper[4848]: I1206 15:53:09.628385 4848 generic.go:334] "Generic (PLEG): container finished" podID="0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7" containerID="a0a5a00f2dafa1043212144c478a255f2034d03a8686c7832bc78eeb2b8e96b9" exitCode=0 Dec 06 15:53:09 crc kubenswrapper[4848]: I1206 15:53:09.628439 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7","Type":"ContainerDied","Data":"a0a5a00f2dafa1043212144c478a255f2034d03a8686c7832bc78eeb2b8e96b9"} Dec 06 15:53:09 crc kubenswrapper[4848]: I1206 15:53:09.652176 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.652159099 
podStartE2EDuration="36.652159099s" podCreationTimestamp="2025-12-06 15:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:53:09.645661155 +0000 UTC m=+1456.943672078" watchObservedRunningTime="2025-12-06 15:53:09.652159099 +0000 UTC m=+1456.950170012" Dec 06 15:53:10 crc kubenswrapper[4848]: I1206 15:53:10.643820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7","Type":"ContainerStarted","Data":"0983ecd20241afdddb1605b0cc553e7662b25bce3dba9a23cb78bb34907e3a38"} Dec 06 15:53:10 crc kubenswrapper[4848]: I1206 15:53:10.644216 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:53:10 crc kubenswrapper[4848]: I1206 15:53:10.671386 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.671369978 podStartE2EDuration="36.671369978s" podCreationTimestamp="2025-12-06 15:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 15:53:10.668140642 +0000 UTC m=+1457.966151565" watchObservedRunningTime="2025-12-06 15:53:10.671369978 +0000 UTC m=+1457.969380891" Dec 06 15:53:17 crc kubenswrapper[4848]: I1206 15:53:17.150570 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:53:17 crc kubenswrapper[4848]: I1206 15:53:17.151206 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 15:53:21 crc kubenswrapper[4848]: I1206 15:53:21.109578 4848 scope.go:117] "RemoveContainer" containerID="d51bf060384adb2b2a7a5f940a6a8aa6f2f3d4cfe023bc723e810f2b1aedf521" Dec 06 15:53:23 crc kubenswrapper[4848]: I1206 15:53:23.979909 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 15:53:25 crc kubenswrapper[4848]: I1206 15:53:25.001886 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.149564 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.155671 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.160222 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.160315 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfwc9\" (UniqueName: \"kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.160349 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.160920 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.262498 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.262574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfwc9\" (UniqueName: \"kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.262603 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.263115 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " 
pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.263377 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.287758 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfwc9\" (UniqueName: \"kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9\") pod \"certified-operators-j7k7c\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:32 crc kubenswrapper[4848]: I1206 15:53:32.483042 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:33 crc kubenswrapper[4848]: I1206 15:53:33.001895 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:33 crc kubenswrapper[4848]: I1206 15:53:33.851190 4848 generic.go:334] "Generic (PLEG): container finished" podID="d70e8961-932b-411f-8d68-a5d580a50dea" containerID="46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e" exitCode=0 Dec 06 15:53:33 crc kubenswrapper[4848]: I1206 15:53:33.851241 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerDied","Data":"46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e"} Dec 06 15:53:33 crc kubenswrapper[4848]: I1206 15:53:33.851492 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" 
event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerStarted","Data":"6d17f8bf6de73925b158f95ed7398583f38888b1e5411ee3e71cae89623a9747"} Dec 06 15:53:35 crc kubenswrapper[4848]: I1206 15:53:35.869112 4848 generic.go:334] "Generic (PLEG): container finished" podID="d70e8961-932b-411f-8d68-a5d580a50dea" containerID="291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8" exitCode=0 Dec 06 15:53:35 crc kubenswrapper[4848]: I1206 15:53:35.869225 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerDied","Data":"291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8"} Dec 06 15:53:36 crc kubenswrapper[4848]: I1206 15:53:36.880290 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerStarted","Data":"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1"} Dec 06 15:53:36 crc kubenswrapper[4848]: I1206 15:53:36.901529 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j7k7c" podStartSLOduration=2.451724796 podStartE2EDuration="4.901512373s" podCreationTimestamp="2025-12-06 15:53:32 +0000 UTC" firstStartedPulling="2025-12-06 15:53:33.853168255 +0000 UTC m=+1481.151179168" lastFinishedPulling="2025-12-06 15:53:36.302955832 +0000 UTC m=+1483.600966745" observedRunningTime="2025-12-06 15:53:36.896993222 +0000 UTC m=+1484.195004135" watchObservedRunningTime="2025-12-06 15:53:36.901512373 +0000 UTC m=+1484.199523286" Dec 06 15:53:42 crc kubenswrapper[4848]: I1206 15:53:42.484279 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:42 crc kubenswrapper[4848]: I1206 15:53:42.485883 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:42 crc kubenswrapper[4848]: I1206 15:53:42.532151 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:42 crc kubenswrapper[4848]: I1206 15:53:42.984787 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:43 crc kubenswrapper[4848]: I1206 15:53:43.030626 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:44 crc kubenswrapper[4848]: I1206 15:53:44.943285 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j7k7c" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="registry-server" containerID="cri-o://03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1" gracePeriod=2 Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.395380 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.537522 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities\") pod \"d70e8961-932b-411f-8d68-a5d580a50dea\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.537854 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfwc9\" (UniqueName: \"kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9\") pod \"d70e8961-932b-411f-8d68-a5d580a50dea\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.538005 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content\") pod \"d70e8961-932b-411f-8d68-a5d580a50dea\" (UID: \"d70e8961-932b-411f-8d68-a5d580a50dea\") " Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.538458 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities" (OuterVolumeSpecName: "utilities") pod "d70e8961-932b-411f-8d68-a5d580a50dea" (UID: "d70e8961-932b-411f-8d68-a5d580a50dea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.543553 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9" (OuterVolumeSpecName: "kube-api-access-mfwc9") pod "d70e8961-932b-411f-8d68-a5d580a50dea" (UID: "d70e8961-932b-411f-8d68-a5d580a50dea"). InnerVolumeSpecName "kube-api-access-mfwc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.596991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d70e8961-932b-411f-8d68-a5d580a50dea" (UID: "d70e8961-932b-411f-8d68-a5d580a50dea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.640079 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.640119 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfwc9\" (UniqueName: \"kubernetes.io/projected/d70e8961-932b-411f-8d68-a5d580a50dea-kube-api-access-mfwc9\") on node \"crc\" DevicePath \"\"" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.640130 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e8961-932b-411f-8d68-a5d580a50dea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.955482 4848 generic.go:334] "Generic (PLEG): container finished" podID="d70e8961-932b-411f-8d68-a5d580a50dea" containerID="03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1" exitCode=0 Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.955539 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerDied","Data":"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1"} Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.955571 4848 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7k7c" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.955614 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7k7c" event={"ID":"d70e8961-932b-411f-8d68-a5d580a50dea","Type":"ContainerDied","Data":"6d17f8bf6de73925b158f95ed7398583f38888b1e5411ee3e71cae89623a9747"} Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.955640 4848 scope.go:117] "RemoveContainer" containerID="03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1" Dec 06 15:53:45 crc kubenswrapper[4848]: I1206 15:53:45.985244 4848 scope.go:117] "RemoveContainer" containerID="291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.008517 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.016193 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j7k7c"] Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.033136 4848 scope.go:117] "RemoveContainer" containerID="46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.074324 4848 scope.go:117] "RemoveContainer" containerID="03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1" Dec 06 15:53:46 crc kubenswrapper[4848]: E1206 15:53:46.074758 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1\": container with ID starting with 03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1 not found: ID does not exist" containerID="03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.074800 
4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1"} err="failed to get container status \"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1\": rpc error: code = NotFound desc = could not find container \"03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1\": container with ID starting with 03d82c5a66740d9f6893640c00b10967c7333c735ba252c8cd6d3c3237de8ed1 not found: ID does not exist" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.074825 4848 scope.go:117] "RemoveContainer" containerID="291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8" Dec 06 15:53:46 crc kubenswrapper[4848]: E1206 15:53:46.075238 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8\": container with ID starting with 291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8 not found: ID does not exist" containerID="291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.075290 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8"} err="failed to get container status \"291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8\": rpc error: code = NotFound desc = could not find container \"291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8\": container with ID starting with 291a8814a1e3d2aac58a7a8bdc57264e32b82ab6c434b1b04b5e4a12620960b8 not found: ID does not exist" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.075326 4848 scope.go:117] "RemoveContainer" containerID="46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e" Dec 06 15:53:46 crc kubenswrapper[4848]: E1206 
15:53:46.075737 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e\": container with ID starting with 46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e not found: ID does not exist" containerID="46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.075779 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e"} err="failed to get container status \"46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e\": rpc error: code = NotFound desc = could not find container \"46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e\": container with ID starting with 46c6e86c600fbe2fc68335fc032a459cf6474f1dfdd643befc0a90cb8f7fe03e not found: ID does not exist" Dec 06 15:53:46 crc kubenswrapper[4848]: I1206 15:53:46.983418 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" path="/var/lib/kubelet/pods/d70e8961-932b-411f-8d68-a5d580a50dea/volumes" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.150730 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.150802 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.150851 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.151602 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.151662 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" gracePeriod=600 Dec 06 15:53:47 crc kubenswrapper[4848]: E1206 15:53:47.286436 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.990484 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" exitCode=0 Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.991548 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8"} Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.991725 4848 scope.go:117] "RemoveContainer" containerID="1a1d1fbb58852277f10718bb790d5a1cff7eb412840195878f28ff1bcf501416" Dec 06 15:53:47 crc kubenswrapper[4848]: I1206 15:53:47.992366 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:53:47 crc kubenswrapper[4848]: E1206 15:53:47.992654 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:54:01 crc kubenswrapper[4848]: I1206 15:54:01.967312 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:54:01 crc kubenswrapper[4848]: E1206 15:54:01.968059 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.818256 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:12 crc kubenswrapper[4848]: E1206 15:54:12.819279 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="registry-server" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.819296 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="registry-server" Dec 06 15:54:12 crc kubenswrapper[4848]: E1206 15:54:12.819310 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="extract-utilities" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.819317 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="extract-utilities" Dec 06 15:54:12 crc kubenswrapper[4848]: E1206 15:54:12.819361 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="extract-content" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.819369 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="extract-content" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.819579 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70e8961-932b-411f-8d68-a5d580a50dea" containerName="registry-server" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.821037 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.842924 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.972129 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtpb\" (UniqueName: \"kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.972232 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.972263 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:12 crc kubenswrapper[4848]: I1206 15:54:12.972777 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:54:12 crc kubenswrapper[4848]: E1206 15:54:12.973018 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.074345 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtpb\" (UniqueName: \"kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.074548 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.074608 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.075459 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.075585 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.098616 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtpb\" (UniqueName: \"kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb\") pod \"redhat-marketplace-5mnfs\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.146297 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:13 crc kubenswrapper[4848]: I1206 15:54:13.621251 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:14 crc kubenswrapper[4848]: I1206 15:54:14.298260 4848 generic.go:334] "Generic (PLEG): container finished" podID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerID="473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3" exitCode=0 Dec 06 15:54:14 crc kubenswrapper[4848]: I1206 15:54:14.298447 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerDied","Data":"473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3"} Dec 06 15:54:14 crc kubenswrapper[4848]: I1206 15:54:14.298558 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerStarted","Data":"d8fd0307f6d4bb0bbb6cb031c166a068048dd94faef2312e4485aeaae159b308"} Dec 06 15:54:15 crc kubenswrapper[4848]: I1206 15:54:15.309122 4848 generic.go:334] "Generic (PLEG): container 
finished" podID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerID="9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98" exitCode=0 Dec 06 15:54:15 crc kubenswrapper[4848]: I1206 15:54:15.309237 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerDied","Data":"9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98"} Dec 06 15:54:16 crc kubenswrapper[4848]: I1206 15:54:16.321011 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerStarted","Data":"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9"} Dec 06 15:54:16 crc kubenswrapper[4848]: I1206 15:54:16.349370 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5mnfs" podStartSLOduration=2.944158088 podStartE2EDuration="4.349349133s" podCreationTimestamp="2025-12-06 15:54:12 +0000 UTC" firstStartedPulling="2025-12-06 15:54:14.300210088 +0000 UTC m=+1521.598221001" lastFinishedPulling="2025-12-06 15:54:15.705401133 +0000 UTC m=+1523.003412046" observedRunningTime="2025-12-06 15:54:16.346914927 +0000 UTC m=+1523.644925850" watchObservedRunningTime="2025-12-06 15:54:16.349349133 +0000 UTC m=+1523.647360046" Dec 06 15:54:21 crc kubenswrapper[4848]: I1206 15:54:21.212287 4848 scope.go:117] "RemoveContainer" containerID="39483f16e27cf26851cbe8c6f30260590a5b9f2670add6ad8922d4fd4ba4bd0f" Dec 06 15:54:23 crc kubenswrapper[4848]: I1206 15:54:23.147411 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:23 crc kubenswrapper[4848]: I1206 15:54:23.147930 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:23 crc 
kubenswrapper[4848]: I1206 15:54:23.194533 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:23 crc kubenswrapper[4848]: I1206 15:54:23.432567 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:23 crc kubenswrapper[4848]: I1206 15:54:23.481094 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:24 crc kubenswrapper[4848]: I1206 15:54:24.966892 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:54:24 crc kubenswrapper[4848]: E1206 15:54:24.967546 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.398882 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5mnfs" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="registry-server" containerID="cri-o://9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9" gracePeriod=2 Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.853492 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.877328 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtpb\" (UniqueName: \"kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb\") pod \"2daea31a-ee40-4b8f-bdb0-033f0196a154\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.877453 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content\") pod \"2daea31a-ee40-4b8f-bdb0-033f0196a154\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.877488 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities\") pod \"2daea31a-ee40-4b8f-bdb0-033f0196a154\" (UID: \"2daea31a-ee40-4b8f-bdb0-033f0196a154\") " Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.879059 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities" (OuterVolumeSpecName: "utilities") pod "2daea31a-ee40-4b8f-bdb0-033f0196a154" (UID: "2daea31a-ee40-4b8f-bdb0-033f0196a154"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.884037 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb" (OuterVolumeSpecName: "kube-api-access-nhtpb") pod "2daea31a-ee40-4b8f-bdb0-033f0196a154" (UID: "2daea31a-ee40-4b8f-bdb0-033f0196a154"). InnerVolumeSpecName "kube-api-access-nhtpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.902204 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2daea31a-ee40-4b8f-bdb0-033f0196a154" (UID: "2daea31a-ee40-4b8f-bdb0-033f0196a154"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.980122 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhtpb\" (UniqueName: \"kubernetes.io/projected/2daea31a-ee40-4b8f-bdb0-033f0196a154-kube-api-access-nhtpb\") on node \"crc\" DevicePath \"\"" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.980165 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:54:25 crc kubenswrapper[4848]: I1206 15:54:25.980181 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea31a-ee40-4b8f-bdb0-033f0196a154-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.410881 4848 generic.go:334] "Generic (PLEG): container finished" podID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerID="9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9" exitCode=0 Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.411057 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerDied","Data":"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9"} Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.411316 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5mnfs" event={"ID":"2daea31a-ee40-4b8f-bdb0-033f0196a154","Type":"ContainerDied","Data":"d8fd0307f6d4bb0bbb6cb031c166a068048dd94faef2312e4485aeaae159b308"} Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.411347 4848 scope.go:117] "RemoveContainer" containerID="9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.411136 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mnfs" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.446236 4848 scope.go:117] "RemoveContainer" containerID="9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.477318 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.492509 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mnfs"] Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.494522 4848 scope.go:117] "RemoveContainer" containerID="473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.525918 4848 scope.go:117] "RemoveContainer" containerID="9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9" Dec 06 15:54:26 crc kubenswrapper[4848]: E1206 15:54:26.526460 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9\": container with ID starting with 9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9 not found: ID does not exist" containerID="9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.526521 4848 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9"} err="failed to get container status \"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9\": rpc error: code = NotFound desc = could not find container \"9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9\": container with ID starting with 9c03b961e590aae1e41159a523b68ae5239e08c6957eefb0d76f19a3907e93e9 not found: ID does not exist" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.526555 4848 scope.go:117] "RemoveContainer" containerID="9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98" Dec 06 15:54:26 crc kubenswrapper[4848]: E1206 15:54:26.527086 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98\": container with ID starting with 9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98 not found: ID does not exist" containerID="9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.527127 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98"} err="failed to get container status \"9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98\": rpc error: code = NotFound desc = could not find container \"9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98\": container with ID starting with 9164b9ea9951a36da9229c97799f9d1967701063b1ea123573d47f5d0e3f0f98 not found: ID does not exist" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.527157 4848 scope.go:117] "RemoveContainer" containerID="473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3" Dec 06 15:54:26 crc kubenswrapper[4848]: E1206 
15:54:26.527503 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3\": container with ID starting with 473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3 not found: ID does not exist" containerID="473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.527542 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3"} err="failed to get container status \"473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3\": rpc error: code = NotFound desc = could not find container \"473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3\": container with ID starting with 473cf4d6cae340d21e704e668800a3718d3b897bc404fabfd2d5b4b96e53f2a3 not found: ID does not exist" Dec 06 15:54:26 crc kubenswrapper[4848]: I1206 15:54:26.979200 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" path="/var/lib/kubelet/pods/2daea31a-ee40-4b8f-bdb0-033f0196a154/volumes" Dec 06 15:54:39 crc kubenswrapper[4848]: I1206 15:54:39.966679 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:54:39 crc kubenswrapper[4848]: E1206 15:54:39.967439 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:54:51 crc kubenswrapper[4848]: I1206 15:54:51.967073 
4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:54:51 crc kubenswrapper[4848]: E1206 15:54:51.967822 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:55:06 crc kubenswrapper[4848]: I1206 15:55:06.966844 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:55:06 crc kubenswrapper[4848]: E1206 15:55:06.968619 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:55:18 crc kubenswrapper[4848]: I1206 15:55:18.966122 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:55:18 crc kubenswrapper[4848]: E1206 15:55:18.966913 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:55:21 crc kubenswrapper[4848]: I1206 
15:55:21.303878 4848 scope.go:117] "RemoveContainer" containerID="e56f0b5bbe6fd90187a3c4bf4a60a370762d5f91ff34a66306c98d05543a6f5a" Dec 06 15:55:21 crc kubenswrapper[4848]: I1206 15:55:21.331014 4848 scope.go:117] "RemoveContainer" containerID="b4bcb07f110ef58ee399698e916213a65c6de902e07a82a2db411be5fc9d05b8" Dec 06 15:55:33 crc kubenswrapper[4848]: I1206 15:55:33.966509 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:55:33 crc kubenswrapper[4848]: E1206 15:55:33.968286 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:55:44 crc kubenswrapper[4848]: I1206 15:55:44.966006 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:55:44 crc kubenswrapper[4848]: E1206 15:55:44.966850 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:55:59 crc kubenswrapper[4848]: I1206 15:55:59.966671 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:55:59 crc kubenswrapper[4848]: E1206 15:55:59.967424 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:56:12 crc kubenswrapper[4848]: I1206 15:56:12.972686 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:56:12 crc kubenswrapper[4848]: E1206 15:56:12.973466 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:56:27 crc kubenswrapper[4848]: I1206 15:56:27.966906 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:56:27 crc kubenswrapper[4848]: E1206 15:56:27.968766 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:56:38 crc kubenswrapper[4848]: I1206 15:56:38.966978 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:56:38 crc kubenswrapper[4848]: E1206 15:56:38.967926 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:56:51 crc kubenswrapper[4848]: I1206 15:56:51.966344 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:56:51 crc kubenswrapper[4848]: E1206 15:56:51.967144 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.046038 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-s8cj2"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.061566 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1ca1-account-create-update-s49tr"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.071754 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7488f"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.079567 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m2c2j"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.087893 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1ca1-account-create-update-s49tr"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.095185 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-7488f"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.104130 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m2c2j"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.112335 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-s8cj2"] Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.976990 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5534716d-e2f0-4e26-825c-8468b6832159" path="/var/lib/kubelet/pods/5534716d-e2f0-4e26-825c-8468b6832159/volumes" Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.977740 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e395dab1-bcea-4e77-8de4-cbd303e30216" path="/var/lib/kubelet/pods/e395dab1-bcea-4e77-8de4-cbd303e30216/volumes" Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.978275 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c835a3-f7fd-439d-8839-f2ebc7923899" path="/var/lib/kubelet/pods/f9c835a3-f7fd-439d-8839-f2ebc7923899/volumes" Dec 06 15:57:00 crc kubenswrapper[4848]: I1206 15:57:00.978840 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa122601-0140-4956-9ff0-2ad0e05ea423" path="/var/lib/kubelet/pods/fa122601-0140-4956-9ff0-2ad0e05ea423/volumes" Dec 06 15:57:01 crc kubenswrapper[4848]: I1206 15:57:01.031925 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-82ca-account-create-update-4szsh"] Dec 06 15:57:01 crc kubenswrapper[4848]: I1206 15:57:01.045983 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1198-account-create-update-bbblr"] Dec 06 15:57:01 crc kubenswrapper[4848]: I1206 15:57:01.054313 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-82ca-account-create-update-4szsh"] Dec 06 15:57:01 crc kubenswrapper[4848]: I1206 15:57:01.062181 4848 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-1198-account-create-update-bbblr"] Dec 06 15:57:02 crc kubenswrapper[4848]: I1206 15:57:02.977488 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3934c7ad-2ee0-4ada-883f-3fceb84c8195" path="/var/lib/kubelet/pods/3934c7ad-2ee0-4ada-883f-3fceb84c8195/volumes" Dec 06 15:57:02 crc kubenswrapper[4848]: I1206 15:57:02.978443 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2bccf8-b293-40e5-a41b-78ec5826fc22" path="/var/lib/kubelet/pods/7e2bccf8-b293-40e5-a41b-78ec5826fc22/volumes" Dec 06 15:57:06 crc kubenswrapper[4848]: I1206 15:57:06.966969 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:57:06 crc kubenswrapper[4848]: E1206 15:57:06.967778 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.440952 4848 scope.go:117] "RemoveContainer" containerID="c7d3357e58d3f88dc9c3baf4c09ecd630ddf3325b0938d3885a84bbb7377b05c" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.475743 4848 scope.go:117] "RemoveContainer" containerID="927a8c35447afa89cb0ac1a704737b3ed8d6c4a4dd0da516239a3a516f50fc6d" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.526193 4848 scope.go:117] "RemoveContainer" containerID="24779547ebbefe88f1fc20baa7693447c72cb98f08c1c558b418cf1b3a3dbe0f" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.570456 4848 scope.go:117] "RemoveContainer" containerID="d826aca8307b94dd27c38288b71ba7b4e0a69a465d297665c0f60ab60cbdf4b1" Dec 06 15:57:21 crc 
kubenswrapper[4848]: I1206 15:57:21.615558 4848 scope.go:117] "RemoveContainer" containerID="d13c0676bddb16e11c0e1e2cfa6661d8152296db2fa6ab094e60a3d25107ed53" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.640811 4848 scope.go:117] "RemoveContainer" containerID="54ae842e030a67d597612828948dd8f3ca668a5a9df7064fa50b634643ddf678" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.670598 4848 scope.go:117] "RemoveContainer" containerID="3dfd060ec291ddb4e4c193a2c4938ce1b8bcb920db2bbfd4871c7cf5244c621d" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.710273 4848 scope.go:117] "RemoveContainer" containerID="67dec1c86133e33f09aacad77b09e2e65dbd99afe333541242906ec4bad2ee44" Dec 06 15:57:21 crc kubenswrapper[4848]: I1206 15:57:21.966663 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:57:21 crc kubenswrapper[4848]: E1206 15:57:21.967133 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.052935 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-89hsl"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.070423 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c32-account-create-update-tkb9x"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.081941 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vj76x"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.095926 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-89hsl"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.106676 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vj76x"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.117269 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3c32-account-create-update-tkb9x"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.128669 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3f66-account-create-update-rnd8v"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.139448 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8861-account-create-update-9w8cq"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.150949 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6wzhg"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.161971 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3f66-account-create-update-rnd8v"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.170228 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6wzhg"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.180584 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8861-account-create-update-9w8cq"] Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.978921 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f2ab4c-f787-4a9c-acce-12e31518edcc" path="/var/lib/kubelet/pods/19f2ab4c-f787-4a9c-acce-12e31518edcc/volumes" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.979490 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242c04cd-53ea-4612-bc7d-e7e7d05700ed" path="/var/lib/kubelet/pods/242c04cd-53ea-4612-bc7d-e7e7d05700ed/volumes" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.980014 4848 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="58460aea-6f24-4346-b46d-7e0a7e0e0eca" path="/var/lib/kubelet/pods/58460aea-6f24-4346-b46d-7e0a7e0e0eca/volumes" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.980505 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eae0c0b-0b9e-4fc7-b9b6-21b14900096d" path="/var/lib/kubelet/pods/6eae0c0b-0b9e-4fc7-b9b6-21b14900096d/volumes" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.981462 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd" path="/var/lib/kubelet/pods/a1ab4bb7-ef7d-44ab-a5b4-5af92e5fdcbd/volumes" Dec 06 15:57:24 crc kubenswrapper[4848]: I1206 15:57:24.982007 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4a5fd6-456b-4eb4-973a-a9cf690e9be8" path="/var/lib/kubelet/pods/ce4a5fd6-456b-4eb4-973a-a9cf690e9be8/volumes" Dec 06 15:57:33 crc kubenswrapper[4848]: I1206 15:57:33.967401 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:57:33 crc kubenswrapper[4848]: E1206 15:57:33.968486 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:43 crc kubenswrapper[4848]: I1206 15:57:43.044041 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p6tct"] Dec 06 15:57:43 crc kubenswrapper[4848]: I1206 15:57:43.054322 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p6tct"] Dec 06 15:57:44 crc kubenswrapper[4848]: I1206 15:57:44.977017 4848 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7a892933-78a1-49d3-abee-8249ee831464" path="/var/lib/kubelet/pods/7a892933-78a1-49d3-abee-8249ee831464/volumes" Dec 06 15:57:45 crc kubenswrapper[4848]: I1206 15:57:45.966767 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:57:45 crc kubenswrapper[4848]: E1206 15:57:45.967227 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:48 crc kubenswrapper[4848]: I1206 15:57:48.026181 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-l4vmt"] Dec 06 15:57:48 crc kubenswrapper[4848]: I1206 15:57:48.037723 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l4vmt"] Dec 06 15:57:48 crc kubenswrapper[4848]: I1206 15:57:48.978512 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b51e33-73e3-4dc5-83a7-fcbd0cc69930" path="/var/lib/kubelet/pods/e7b51e33-73e3-4dc5-83a7-fcbd0cc69930/volumes" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.031653 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-6625-account-create-update-7bnhs"] Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.045801 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-9w7v4"] Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.055331 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-6625-account-create-update-7bnhs"] Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.063251 4848 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ironic-db-create-9w7v4"] Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.362444 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:57:49 crc kubenswrapper[4848]: E1206 15:57:49.363009 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="extract-utilities" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.363029 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="extract-utilities" Dec 06 15:57:49 crc kubenswrapper[4848]: E1206 15:57:49.363044 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="extract-content" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.363053 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="extract-content" Dec 06 15:57:49 crc kubenswrapper[4848]: E1206 15:57:49.363096 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="registry-server" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.363116 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="registry-server" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.363349 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2daea31a-ee40-4b8f-bdb0-033f0196a154" containerName="registry-server" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.365141 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.376505 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.455090 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmg9z\" (UniqueName: \"kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.455161 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.455212 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.557163 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmg9z\" (UniqueName: \"kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.557229 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.557250 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.557917 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.557936 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.580403 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmg9z\" (UniqueName: \"kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z\") pod \"redhat-operators-6pn7m\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:49 crc kubenswrapper[4848]: I1206 15:57:49.693990 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:50 crc kubenswrapper[4848]: I1206 15:57:50.249884 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:57:50 crc kubenswrapper[4848]: I1206 15:57:50.978673 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d771cf-7f6f-474d-bfed-6e48e6deca38" path="/var/lib/kubelet/pods/14d771cf-7f6f-474d-bfed-6e48e6deca38/volumes" Dec 06 15:57:50 crc kubenswrapper[4848]: I1206 15:57:50.979772 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29032f23-2ae4-4ec4-8a4c-8f647d120974" path="/var/lib/kubelet/pods/29032f23-2ae4-4ec4-8a4c-8f647d120974/volumes" Dec 06 15:57:51 crc kubenswrapper[4848]: I1206 15:57:51.254942 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerID="179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa" exitCode=0 Dec 06 15:57:51 crc kubenswrapper[4848]: I1206 15:57:51.254998 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerDied","Data":"179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa"} Dec 06 15:57:51 crc kubenswrapper[4848]: I1206 15:57:51.255030 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerStarted","Data":"83d372d79f6a836f4308b31bede8963ffdf23f5484cd1f811d07b992382240d1"} Dec 06 15:57:51 crc kubenswrapper[4848]: I1206 15:57:51.257933 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 15:57:52 crc kubenswrapper[4848]: I1206 15:57:52.265210 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" 
event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerStarted","Data":"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47"} Dec 06 15:57:54 crc kubenswrapper[4848]: I1206 15:57:54.284765 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerID="a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47" exitCode=0 Dec 06 15:57:54 crc kubenswrapper[4848]: I1206 15:57:54.284922 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerDied","Data":"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47"} Dec 06 15:57:55 crc kubenswrapper[4848]: I1206 15:57:55.296141 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerStarted","Data":"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae"} Dec 06 15:57:55 crc kubenswrapper[4848]: I1206 15:57:55.317231 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6pn7m" podStartSLOduration=2.7328335150000003 podStartE2EDuration="6.317214362s" podCreationTimestamp="2025-12-06 15:57:49 +0000 UTC" firstStartedPulling="2025-12-06 15:57:51.257710207 +0000 UTC m=+1738.555721120" lastFinishedPulling="2025-12-06 15:57:54.842091044 +0000 UTC m=+1742.140101967" observedRunningTime="2025-12-06 15:57:55.31601647 +0000 UTC m=+1742.614027383" watchObservedRunningTime="2025-12-06 15:57:55.317214362 +0000 UTC m=+1742.615225275" Dec 06 15:57:56 crc kubenswrapper[4848]: I1206 15:57:56.967513 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:57:56 crc kubenswrapper[4848]: E1206 15:57:56.968126 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:57:59 crc kubenswrapper[4848]: I1206 15:57:59.694921 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:57:59 crc kubenswrapper[4848]: I1206 15:57:59.695363 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:58:00 crc kubenswrapper[4848]: I1206 15:58:00.761504 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6pn7m" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="registry-server" probeResult="failure" output=< Dec 06 15:58:00 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Dec 06 15:58:00 crc kubenswrapper[4848]: > Dec 06 15:58:09 crc kubenswrapper[4848]: I1206 15:58:09.747321 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:58:09 crc kubenswrapper[4848]: I1206 15:58:09.796512 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:58:09 crc kubenswrapper[4848]: I1206 15:58:09.987784 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:58:10 crc kubenswrapper[4848]: I1206 15:58:10.966621 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:58:10 crc kubenswrapper[4848]: E1206 15:58:10.967349 4848 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:58:11 crc kubenswrapper[4848]: I1206 15:58:11.426898 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6pn7m" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="registry-server" containerID="cri-o://b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae" gracePeriod=2 Dec 06 15:58:11 crc kubenswrapper[4848]: I1206 15:58:11.894597 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:58:11 crc kubenswrapper[4848]: I1206 15:58:11.990400 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities\") pod \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " Dec 06 15:58:11 crc kubenswrapper[4848]: I1206 15:58:11.991640 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content\") pod \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " Dec 06 15:58:11 crc kubenswrapper[4848]: I1206 15:58:11.991566 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities" (OuterVolumeSpecName: "utilities") pod "a6402aed-c491-4371-9fc0-8b44ca21a7d9" (UID: "a6402aed-c491-4371-9fc0-8b44ca21a7d9"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.001516 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmg9z\" (UniqueName: \"kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z\") pod \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\" (UID: \"a6402aed-c491-4371-9fc0-8b44ca21a7d9\") " Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.002369 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.008091 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z" (OuterVolumeSpecName: "kube-api-access-tmg9z") pod "a6402aed-c491-4371-9fc0-8b44ca21a7d9" (UID: "a6402aed-c491-4371-9fc0-8b44ca21a7d9"). InnerVolumeSpecName "kube-api-access-tmg9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.104559 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmg9z\" (UniqueName: \"kubernetes.io/projected/a6402aed-c491-4371-9fc0-8b44ca21a7d9-kube-api-access-tmg9z\") on node \"crc\" DevicePath \"\"" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.107069 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6402aed-c491-4371-9fc0-8b44ca21a7d9" (UID: "a6402aed-c491-4371-9fc0-8b44ca21a7d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.206996 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6402aed-c491-4371-9fc0-8b44ca21a7d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.437667 4848 generic.go:334] "Generic (PLEG): container finished" podID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerID="b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae" exitCode=0 Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.437719 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerDied","Data":"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae"} Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.437754 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pn7m" event={"ID":"a6402aed-c491-4371-9fc0-8b44ca21a7d9","Type":"ContainerDied","Data":"83d372d79f6a836f4308b31bede8963ffdf23f5484cd1f811d07b992382240d1"} Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.437780 4848 scope.go:117] "RemoveContainer" containerID="b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.437807 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pn7m" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.461059 4848 scope.go:117] "RemoveContainer" containerID="a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.478867 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.488193 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6pn7m"] Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.498040 4848 scope.go:117] "RemoveContainer" containerID="179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.529801 4848 scope.go:117] "RemoveContainer" containerID="b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae" Dec 06 15:58:12 crc kubenswrapper[4848]: E1206 15:58:12.530291 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae\": container with ID starting with b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae not found: ID does not exist" containerID="b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.530324 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae"} err="failed to get container status \"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae\": rpc error: code = NotFound desc = could not find container \"b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae\": container with ID starting with b205388c9da6210f20a5d083a1d5abaffd4c07103fd5efc258a553e2bb382dae not found: ID does 
not exist" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.530345 4848 scope.go:117] "RemoveContainer" containerID="a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47" Dec 06 15:58:12 crc kubenswrapper[4848]: E1206 15:58:12.530607 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47\": container with ID starting with a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47 not found: ID does not exist" containerID="a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.530648 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47"} err="failed to get container status \"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47\": rpc error: code = NotFound desc = could not find container \"a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47\": container with ID starting with a707bd25575f723ef62b4d321d66207a3658c527d7c55cfd327117c8a9871d47 not found: ID does not exist" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.530673 4848 scope.go:117] "RemoveContainer" containerID="179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa" Dec 06 15:58:12 crc kubenswrapper[4848]: E1206 15:58:12.531189 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa\": container with ID starting with 179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa not found: ID does not exist" containerID="179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.531224 4848 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa"} err="failed to get container status \"179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa\": rpc error: code = NotFound desc = could not find container \"179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa\": container with ID starting with 179e68c79672f6501bd35f7630c1e9cf216f02429366021e80f42569f8376efa not found: ID does not exist" Dec 06 15:58:12 crc kubenswrapper[4848]: I1206 15:58:12.978183 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" path="/var/lib/kubelet/pods/a6402aed-c491-4371-9fc0-8b44ca21a7d9/volumes" Dec 06 15:58:21 crc kubenswrapper[4848]: I1206 15:58:21.874434 4848 scope.go:117] "RemoveContainer" containerID="ba6d1ef608a1297fe15ddb76c6dba0807ef21db42c5af25d2b5d13fc8d687077" Dec 06 15:58:21 crc kubenswrapper[4848]: I1206 15:58:21.907116 4848 scope.go:117] "RemoveContainer" containerID="54d0c6bb49657f48afeb845d96777d9f1dcf881521a8dca5bb5bb6ec0d6922e7" Dec 06 15:58:21 crc kubenswrapper[4848]: I1206 15:58:21.946682 4848 scope.go:117] "RemoveContainer" containerID="3939b923cc20641397bfd5785f5c7a548310b3d92c805ac9d6c52af866e89ac2" Dec 06 15:58:21 crc kubenswrapper[4848]: I1206 15:58:21.967304 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:58:21 crc kubenswrapper[4848]: E1206 15:58:21.967560 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:58:22 crc kubenswrapper[4848]: 
I1206 15:58:22.008099 4848 scope.go:117] "RemoveContainer" containerID="ef4659fc99ee6cac30b751a280ea6d3c224353e1179e2cbca40da0d0ef12143d" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.042685 4848 scope.go:117] "RemoveContainer" containerID="c6081ff52d7a92534d5cd0e5ad138d067d5291ef1ffce09cd5888834425209f1" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.104463 4848 scope.go:117] "RemoveContainer" containerID="ce0e406f279064825e9529bec9f70f2aa4138cb90d6c059852d844bef90c3485" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.148365 4848 scope.go:117] "RemoveContainer" containerID="738119c3dc529bf6faa01ad310109bd04fa86661da868f2d4884b217ca9f9d88" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.182249 4848 scope.go:117] "RemoveContainer" containerID="b5dc10be32b6eac52d81b1eeb98008339d651ea544b0919f5080ebd870bd21ce" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.217043 4848 scope.go:117] "RemoveContainer" containerID="b2153f3a84bf7bf3848877a9b692a4d828ae0f91c2bcc800ee8fcc133d118375" Dec 06 15:58:22 crc kubenswrapper[4848]: I1206 15:58:22.243597 4848 scope.go:117] "RemoveContainer" containerID="62c4f4edbb0973c5f6aed65262e9556a3716e4ff42d3616c81a949895967746d" Dec 06 15:58:27 crc kubenswrapper[4848]: I1206 15:58:27.046619 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q57fg"] Dec 06 15:58:27 crc kubenswrapper[4848]: I1206 15:58:27.055142 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q57fg"] Dec 06 15:58:28 crc kubenswrapper[4848]: I1206 15:58:28.976163 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c533e36c-3e3e-4df0-85d8-81c87c5c8087" path="/var/lib/kubelet/pods/c533e36c-3e3e-4df0-85d8-81c87c5c8087/volumes" Dec 06 15:58:31 crc kubenswrapper[4848]: I1206 15:58:31.028674 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-f882q"] Dec 06 15:58:31 crc kubenswrapper[4848]: I1206 
15:58:31.039125 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-f882q"] Dec 06 15:58:32 crc kubenswrapper[4848]: I1206 15:58:32.973770 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:58:32 crc kubenswrapper[4848]: E1206 15:58:32.974305 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:58:32 crc kubenswrapper[4848]: I1206 15:58:32.976951 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46997064-cc24-406e-8971-0cdbad196707" path="/var/lib/kubelet/pods/46997064-cc24-406e-8971-0cdbad196707/volumes" Dec 06 15:58:44 crc kubenswrapper[4848]: I1206 15:58:44.971686 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:58:44 crc kubenswrapper[4848]: E1206 15:58:44.974461 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 15:58:57 crc kubenswrapper[4848]: I1206 15:58:57.966325 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 15:58:58 crc kubenswrapper[4848]: I1206 15:58:58.875528 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d"} Dec 06 15:58:59 crc kubenswrapper[4848]: I1206 15:58:59.044396 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zqt4v"] Dec 06 15:58:59 crc kubenswrapper[4848]: I1206 15:58:59.053280 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zqt4v"] Dec 06 15:59:00 crc kubenswrapper[4848]: I1206 15:59:00.979760 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4758d52-e17c-484e-96a3-4879daace03e" path="/var/lib/kubelet/pods/b4758d52-e17c-484e-96a3-4879daace03e/volumes" Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.044153 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s6zql"] Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.078727 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rfkvg"] Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.086295 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s6zql"] Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.093787 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rfkvg"] Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.978374 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8d9713-c9fb-42c1-8496-03e949d82d8e" path="/var/lib/kubelet/pods/bb8d9713-c9fb-42c1-8496-03e949d82d8e/volumes" Dec 06 15:59:20 crc kubenswrapper[4848]: I1206 15:59:20.979589 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0186b0-9bb6-401b-bec2-80ee1058b4e8" path="/var/lib/kubelet/pods/dc0186b0-9bb6-401b-bec2-80ee1058b4e8/volumes" Dec 06 15:59:22 crc kubenswrapper[4848]: I1206 15:59:22.438195 4848 scope.go:117] 
"RemoveContainer" containerID="05affa8c282e92cb4ca335526850d4aa59af7970528d18157444e3ef1ad7a5d2" Dec 06 15:59:22 crc kubenswrapper[4848]: I1206 15:59:22.473407 4848 scope.go:117] "RemoveContainer" containerID="c59f025526913bd153a362ea60cd92bc58106084b528e88b90f3e57cee1f71af" Dec 06 15:59:22 crc kubenswrapper[4848]: I1206 15:59:22.537179 4848 scope.go:117] "RemoveContainer" containerID="1d28c15762b6288c63e38c8b80841f096bf6cb44b59512f1451441ff95905739" Dec 06 15:59:22 crc kubenswrapper[4848]: I1206 15:59:22.577941 4848 scope.go:117] "RemoveContainer" containerID="1f000778b9ab1df593521956c011d2ce5a5572717d7dfa8c2a84f7bdc0f3e387" Dec 06 15:59:22 crc kubenswrapper[4848]: I1206 15:59:22.617656 4848 scope.go:117] "RemoveContainer" containerID="db7fef949e7c57679ee0769869c038c2a6713e7c9061b14379f48c9031fd98bf" Dec 06 15:59:28 crc kubenswrapper[4848]: I1206 15:59:28.026885 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5ct5k"] Dec 06 15:59:28 crc kubenswrapper[4848]: I1206 15:59:28.037351 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5ct5k"] Dec 06 15:59:28 crc kubenswrapper[4848]: I1206 15:59:28.976219 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af5a0b7-5219-4bf8-9e36-87218655227b" path="/var/lib/kubelet/pods/9af5a0b7-5219-4bf8-9e36-87218655227b/volumes" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.032604 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9t25c"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.043681 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wxr2n"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.055458 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c7d9-account-create-update-fx892"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.064984 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-9t25c"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.073904 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wxr2n"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.083164 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c7d9-account-create-update-fx892"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.642226 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qlx96/must-gather-7jnlk"] Dec 06 15:59:30 crc kubenswrapper[4848]: E1206 15:59:30.642885 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="extract-utilities" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.642903 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="extract-utilities" Dec 06 15:59:30 crc kubenswrapper[4848]: E1206 15:59:30.642929 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="registry-server" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.642937 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="registry-server" Dec 06 15:59:30 crc kubenswrapper[4848]: E1206 15:59:30.642962 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="extract-content" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.642968 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="extract-content" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.643150 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6402aed-c491-4371-9fc0-8b44ca21a7d9" containerName="registry-server" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 
15:59:30.644118 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.646896 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qlx96"/"default-dockercfg-rbll6" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.647075 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qlx96"/"kube-root-ca.crt" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.647724 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qlx96"/"openshift-service-ca.crt" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.659311 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qlx96/must-gather-7jnlk"] Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.800955 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output\") pod \"must-gather-7jnlk\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.801069 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dq6j\" (UniqueName: \"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j\") pod \"must-gather-7jnlk\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.902859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output\") pod \"must-gather-7jnlk\" (UID: 
\"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.902960 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dq6j\" (UniqueName: \"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j\") pod \"must-gather-7jnlk\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.903316 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output\") pod \"must-gather-7jnlk\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.921407 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dq6j\" (UniqueName: \"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j\") pod \"must-gather-7jnlk\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.965023 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.991830 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81599cc7-40f1-4f1c-96e5-9465b89b0517" path="/var/lib/kubelet/pods/81599cc7-40f1-4f1c-96e5-9465b89b0517/volumes" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.992549 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9624a1f-b378-4f0a-af01-12f45f8c7694" path="/var/lib/kubelet/pods/d9624a1f-b378-4f0a-af01-12f45f8c7694/volumes" Dec 06 15:59:30 crc kubenswrapper[4848]: I1206 15:59:30.993204 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3358e13-6e57-435c-ba4f-671c970019fd" path="/var/lib/kubelet/pods/e3358e13-6e57-435c-ba4f-671c970019fd/volumes" Dec 06 15:59:31 crc kubenswrapper[4848]: I1206 15:59:31.032768 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8c96-account-create-update-r6sqf"] Dec 06 15:59:31 crc kubenswrapper[4848]: I1206 15:59:31.040877 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b459-account-create-update-7d7s8"] Dec 06 15:59:31 crc kubenswrapper[4848]: I1206 15:59:31.049785 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8c96-account-create-update-r6sqf"] Dec 06 15:59:31 crc kubenswrapper[4848]: I1206 15:59:31.057288 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b459-account-create-update-7d7s8"] Dec 06 15:59:31 crc kubenswrapper[4848]: I1206 15:59:31.455095 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qlx96/must-gather-7jnlk"] Dec 06 15:59:32 crc kubenswrapper[4848]: I1206 15:59:32.195843 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/must-gather-7jnlk" 
event={"ID":"71bd67b7-42c5-4384-8431-07f05f3ae0a1","Type":"ContainerStarted","Data":"a3265a3300b6e4733d7d97f87e03a5942ed2c8a005656a6cdcc2e16cb525ec5e"} Dec 06 15:59:32 crc kubenswrapper[4848]: I1206 15:59:32.978251 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d15aeb9-60fe-4de1-a715-7843431f9f7f" path="/var/lib/kubelet/pods/7d15aeb9-60fe-4de1-a715-7843431f9f7f/volumes" Dec 06 15:59:32 crc kubenswrapper[4848]: I1206 15:59:32.979088 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ae7cfb-1f4e-4947-94c8-d358f5e36476" path="/var/lib/kubelet/pods/f4ae7cfb-1f4e-4947-94c8-d358f5e36476/volumes" Dec 06 15:59:35 crc kubenswrapper[4848]: I1206 15:59:35.036692 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-572c-account-create-update-j47lb"] Dec 06 15:59:35 crc kubenswrapper[4848]: I1206 15:59:35.049708 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-572c-account-create-update-j47lb"] Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.025952 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-5sg59"] Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.037100 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-5sg59"] Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.230243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/must-gather-7jnlk" event={"ID":"71bd67b7-42c5-4384-8431-07f05f3ae0a1","Type":"ContainerStarted","Data":"2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a"} Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.230297 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/must-gather-7jnlk" 
event={"ID":"71bd67b7-42c5-4384-8431-07f05f3ae0a1","Type":"ContainerStarted","Data":"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60"} Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.254047 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qlx96/must-gather-7jnlk" podStartSLOduration=2.381527273 podStartE2EDuration="6.254028441s" podCreationTimestamp="2025-12-06 15:59:30 +0000 UTC" firstStartedPulling="2025-12-06 15:59:31.461997177 +0000 UTC m=+1838.760008100" lastFinishedPulling="2025-12-06 15:59:35.334498355 +0000 UTC m=+1842.632509268" observedRunningTime="2025-12-06 15:59:36.246213039 +0000 UTC m=+1843.544223952" watchObservedRunningTime="2025-12-06 15:59:36.254028441 +0000 UTC m=+1843.552039354" Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.978975 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03352dd8-a07d-4822-adcb-64517cb16b2e" path="/var/lib/kubelet/pods/03352dd8-a07d-4822-adcb-64517cb16b2e/volumes" Dec 06 15:59:36 crc kubenswrapper[4848]: I1206 15:59:36.979517 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d0d42e-a46f-49c4-a637-81da68446876" path="/var/lib/kubelet/pods/69d0d42e-a46f-49c4-a637-81da68446876/volumes" Dec 06 15:59:38 crc kubenswrapper[4848]: I1206 15:59:38.926071 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qlx96/crc-debug-tvd8r"] Dec 06 15:59:38 crc kubenswrapper[4848]: I1206 15:59:38.927537 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.059903 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.060335 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xf75\" (UniqueName: \"kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.162929 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.163129 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.163268 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xf75\" (UniqueName: \"kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r" Dec 06 15:59:39 crc 
kubenswrapper[4848]: I1206 15:59:39.200523 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xf75\" (UniqueName: \"kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75\") pod \"crc-debug-tvd8r\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") " pod="openshift-must-gather-qlx96/crc-debug-tvd8r"
Dec 06 15:59:39 crc kubenswrapper[4848]: I1206 15:59:39.244598 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-tvd8r"
Dec 06 15:59:39 crc kubenswrapper[4848]: W1206 15:59:39.287437 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50dca36b_39b4_434b_97e7_c0ef3fd8920c.slice/crio-25e47eb373b797647ad4a4a5fde06339d8a70670c7fdca274c348a5dbe206aaa WatchSource:0}: Error finding container 25e47eb373b797647ad4a4a5fde06339d8a70670c7fdca274c348a5dbe206aaa: Status 404 returned error can't find the container with id 25e47eb373b797647ad4a4a5fde06339d8a70670c7fdca274c348a5dbe206aaa
Dec 06 15:59:40 crc kubenswrapper[4848]: I1206 15:59:40.273565 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-tvd8r" event={"ID":"50dca36b-39b4-434b-97e7-c0ef3fd8920c","Type":"ContainerStarted","Data":"25e47eb373b797647ad4a4a5fde06339d8a70670c7fdca274c348a5dbe206aaa"}
Dec 06 15:59:52 crc kubenswrapper[4848]: I1206 15:59:52.423853 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-tvd8r" event={"ID":"50dca36b-39b4-434b-97e7-c0ef3fd8920c","Type":"ContainerStarted","Data":"c8f9d6826b794120f097112e43636b444dc32c3c59e4b55bb7188d52d22fc057"}
Dec 06 15:59:52 crc kubenswrapper[4848]: I1206 15:59:52.444062 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qlx96/crc-debug-tvd8r" podStartSLOduration=2.108343906 podStartE2EDuration="14.444045142s" podCreationTimestamp="2025-12-06 15:59:38 +0000 UTC" firstStartedPulling="2025-12-06 15:59:39.290237389 +0000 UTC m=+1846.588248302" lastFinishedPulling="2025-12-06 15:59:51.625938625 +0000 UTC m=+1858.923949538" observedRunningTime="2025-12-06 15:59:52.438074821 +0000 UTC m=+1859.736085744" watchObservedRunningTime="2025-12-06 15:59:52.444045142 +0000 UTC m=+1859.742056045"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.149236 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"]
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.151256 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.152961 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.153344 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.157393 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"]
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.252378 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m276\" (UniqueName: \"kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.252622 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.252805 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.354840 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m276\" (UniqueName: \"kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.354952 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.355000 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.355855 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.360667 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.373041 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m276\" (UniqueName: \"kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276\") pod \"collect-profiles-29417280-5n4vr\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:00 crc kubenswrapper[4848]: I1206 16:00:00.476225 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:07 crc kubenswrapper[4848]: I1206 16:00:07.456037 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"]
Dec 06 16:00:07 crc kubenswrapper[4848]: I1206 16:00:07.569890 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr" event={"ID":"b5a77c57-f806-4c6d-8734-01c4ad3239c2","Type":"ContainerStarted","Data":"a63d126a62c370249a9688894877c38ea46cabc93e8b3f00277112655c2b8143"}
Dec 06 16:00:08 crc kubenswrapper[4848]: I1206 16:00:08.580080 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5a77c57-f806-4c6d-8734-01c4ad3239c2" containerID="be838589f73395024b66d0cf715ba951765a2d63f4a89292d9590d7538f8c9df" exitCode=0
Dec 06 16:00:08 crc kubenswrapper[4848]: I1206 16:00:08.580120 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr" event={"ID":"b5a77c57-f806-4c6d-8734-01c4ad3239c2","Type":"ContainerDied","Data":"be838589f73395024b66d0cf715ba951765a2d63f4a89292d9590d7538f8c9df"}
Dec 06 16:00:09 crc kubenswrapper[4848]: I1206 16:00:09.922236 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.041659 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m276\" (UniqueName: \"kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276\") pod \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") "
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.041765 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume\") pod \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") "
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.041793 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume\") pod \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\" (UID: \"b5a77c57-f806-4c6d-8734-01c4ad3239c2\") "
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.042604 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5a77c57-f806-4c6d-8734-01c4ad3239c2" (UID: "b5a77c57-f806-4c6d-8734-01c4ad3239c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.060712 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276" (OuterVolumeSpecName: "kube-api-access-4m276") pod "b5a77c57-f806-4c6d-8734-01c4ad3239c2" (UID: "b5a77c57-f806-4c6d-8734-01c4ad3239c2"). InnerVolumeSpecName "kube-api-access-4m276". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.061602 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5a77c57-f806-4c6d-8734-01c4ad3239c2" (UID: "b5a77c57-f806-4c6d-8734-01c4ad3239c2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.144506 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m276\" (UniqueName: \"kubernetes.io/projected/b5a77c57-f806-4c6d-8734-01c4ad3239c2-kube-api-access-4m276\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.144545 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a77c57-f806-4c6d-8734-01c4ad3239c2-config-volume\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.144553 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a77c57-f806-4c6d-8734-01c4ad3239c2-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.598422 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr" event={"ID":"b5a77c57-f806-4c6d-8734-01c4ad3239c2","Type":"ContainerDied","Data":"a63d126a62c370249a9688894877c38ea46cabc93e8b3f00277112655c2b8143"}
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.598462 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63d126a62c370249a9688894877c38ea46cabc93e8b3f00277112655c2b8143"
Dec 06 16:00:10 crc kubenswrapper[4848]: I1206 16:00:10.598488 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29417280-5n4vr"
Dec 06 16:00:22 crc kubenswrapper[4848]: I1206 16:00:22.782225 4848 scope.go:117] "RemoveContainer" containerID="d1db1c5f39966e4065b63ca4719f2e1b399ba6f6336fd2f892f6a886f7a61be9"
Dec 06 16:00:22 crc kubenswrapper[4848]: I1206 16:00:22.834181 4848 scope.go:117] "RemoveContainer" containerID="725cf6c1857963b710d76c9bc2b2ed1a2af1d1e43ab68b22b140a0741d3cc914"
Dec 06 16:00:22 crc kubenswrapper[4848]: I1206 16:00:22.874103 4848 scope.go:117] "RemoveContainer" containerID="b4857ac3c41c0b1802966e2f124cb66a8faedaa7d83295efed37e6c6b472aa46"
Dec 06 16:00:22 crc kubenswrapper[4848]: I1206 16:00:22.926618 4848 scope.go:117] "RemoveContainer" containerID="5fd6129be1335ee4ff05e8297387246755952393d2aabd6d7348ca9d3217ced0"
Dec 06 16:00:22 crc kubenswrapper[4848]: I1206 16:00:22.978974 4848 scope.go:117] "RemoveContainer" containerID="6bcdd131a08cd0c20a53527616c28bf91c743559589348c37fe7ab6cceaa4743"
Dec 06 16:00:23 crc kubenswrapper[4848]: I1206 16:00:23.029783 4848 scope.go:117] "RemoveContainer" containerID="7c693a591a909a1baedb6df1ea688682e5b6abacaf3c4cde2b528e2328a4e385"
Dec 06 16:00:23 crc kubenswrapper[4848]: I1206 16:00:23.082890 4848 scope.go:117] "RemoveContainer" containerID="b0fa0e34a4343cdfaadf196b3ac94cefd6cf6be7b1b969f0308f150eb55e6555"
Dec 06 16:00:23 crc kubenswrapper[4848]: I1206 16:00:23.103691 4848 scope.go:117] "RemoveContainer" containerID="737e921d23414808f8aeb148a7ce3b5ebadcebc6c176c774b58b32a428cbb62e"
Dec 06 16:00:36 crc kubenswrapper[4848]: I1206 16:00:36.817914 4848 generic.go:334] "Generic (PLEG): container finished" podID="50dca36b-39b4-434b-97e7-c0ef3fd8920c" containerID="c8f9d6826b794120f097112e43636b444dc32c3c59e4b55bb7188d52d22fc057" exitCode=0
Dec 06 16:00:36 crc kubenswrapper[4848]: I1206 16:00:36.817987 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-tvd8r" event={"ID":"50dca36b-39b4-434b-97e7-c0ef3fd8920c","Type":"ContainerDied","Data":"c8f9d6826b794120f097112e43636b444dc32c3c59e4b55bb7188d52d22fc057"}
Dec 06 16:00:37 crc kubenswrapper[4848]: I1206 16:00:37.934011 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-tvd8r"
Dec 06 16:00:37 crc kubenswrapper[4848]: I1206 16:00:37.963659 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-tvd8r"]
Dec 06 16:00:37 crc kubenswrapper[4848]: I1206 16:00:37.971093 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-tvd8r"]
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.010753 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host\") pod \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") "
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.010854 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host" (OuterVolumeSpecName: "host") pod "50dca36b-39b4-434b-97e7-c0ef3fd8920c" (UID: "50dca36b-39b4-434b-97e7-c0ef3fd8920c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.011045 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xf75\" (UniqueName: \"kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75\") pod \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\" (UID: \"50dca36b-39b4-434b-97e7-c0ef3fd8920c\") "
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.013474 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50dca36b-39b4-434b-97e7-c0ef3fd8920c-host\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.018680 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75" (OuterVolumeSpecName: "kube-api-access-8xf75") pod "50dca36b-39b4-434b-97e7-c0ef3fd8920c" (UID: "50dca36b-39b4-434b-97e7-c0ef3fd8920c"). InnerVolumeSpecName "kube-api-access-8xf75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.115971 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xf75\" (UniqueName: \"kubernetes.io/projected/50dca36b-39b4-434b-97e7-c0ef3fd8920c-kube-api-access-8xf75\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.846577 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e47eb373b797647ad4a4a5fde06339d8a70670c7fdca274c348a5dbe206aaa"
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.846719 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-tvd8r"
Dec 06 16:00:38 crc kubenswrapper[4848]: I1206 16:00:38.982022 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50dca36b-39b4-434b-97e7-c0ef3fd8920c" path="/var/lib/kubelet/pods/50dca36b-39b4-434b-97e7-c0ef3fd8920c/volumes"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.041918 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-76qm4"]
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.052208 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-76qm4"]
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.178604 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qlx96/crc-debug-kj8kb"]
Dec 06 16:00:39 crc kubenswrapper[4848]: E1206 16:00:39.179127 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a77c57-f806-4c6d-8734-01c4ad3239c2" containerName="collect-profiles"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.179154 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a77c57-f806-4c6d-8734-01c4ad3239c2" containerName="collect-profiles"
Dec 06 16:00:39 crc kubenswrapper[4848]: E1206 16:00:39.179176 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dca36b-39b4-434b-97e7-c0ef3fd8920c" containerName="container-00"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.179186 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dca36b-39b4-434b-97e7-c0ef3fd8920c" containerName="container-00"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.179458 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a77c57-f806-4c6d-8734-01c4ad3239c2" containerName="collect-profiles"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.179485 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dca36b-39b4-434b-97e7-c0ef3fd8920c" containerName="container-00"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.180656 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.235660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwb88\" (UniqueName: \"kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.235756 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.336867 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.336988 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.337050 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwb88\" (UniqueName: \"kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.355525 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwb88\" (UniqueName: \"kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88\") pod \"crc-debug-kj8kb\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") " pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.503991 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.856197 4848 generic.go:334] "Generic (PLEG): container finished" podID="060ec9d2-6752-4521-8559-153f39829b3e" containerID="8b2771178ba29b0479ca964a4235099c06b71645a0dde4dd35bb6f07b3b27354" exitCode=0
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.856284 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-kj8kb" event={"ID":"060ec9d2-6752-4521-8559-153f39829b3e","Type":"ContainerDied","Data":"8b2771178ba29b0479ca964a4235099c06b71645a0dde4dd35bb6f07b3b27354"}
Dec 06 16:00:39 crc kubenswrapper[4848]: I1206 16:00:39.856460 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-kj8kb" event={"ID":"060ec9d2-6752-4521-8559-153f39829b3e","Type":"ContainerStarted","Data":"3c9da7a105310ac8134174aa5aecb3b8423b7154fb6891b741fb46953ac949c5"}
Dec 06 16:00:40 crc kubenswrapper[4848]: I1206 16:00:40.307209 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-kj8kb"]
Dec 06 16:00:40 crc kubenswrapper[4848]: I1206 16:00:40.316093 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-kj8kb"]
Dec 06 16:00:40 crc kubenswrapper[4848]: I1206 16:00:40.956139 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:40 crc kubenswrapper[4848]: I1206 16:00:40.976792 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a42d59-df8c-420d-bb24-c8476a868dd9" path="/var/lib/kubelet/pods/95a42d59-df8c-420d-bb24-c8476a868dd9/volumes"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.062112 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host\") pod \"060ec9d2-6752-4521-8559-153f39829b3e\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") "
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.062193 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwb88\" (UniqueName: \"kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88\") pod \"060ec9d2-6752-4521-8559-153f39829b3e\" (UID: \"060ec9d2-6752-4521-8559-153f39829b3e\") "
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.062539 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host" (OuterVolumeSpecName: "host") pod "060ec9d2-6752-4521-8559-153f39829b3e" (UID: "060ec9d2-6752-4521-8559-153f39829b3e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.062839 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/060ec9d2-6752-4521-8559-153f39829b3e-host\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.067910 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88" (OuterVolumeSpecName: "kube-api-access-dwb88") pod "060ec9d2-6752-4521-8559-153f39829b3e" (UID: "060ec9d2-6752-4521-8559-153f39829b3e"). InnerVolumeSpecName "kube-api-access-dwb88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.164680 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwb88\" (UniqueName: \"kubernetes.io/projected/060ec9d2-6752-4521-8559-153f39829b3e-kube-api-access-dwb88\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.439712 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qlx96/crc-debug-wq6s9"]
Dec 06 16:00:41 crc kubenswrapper[4848]: E1206 16:00:41.440139 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060ec9d2-6752-4521-8559-153f39829b3e" containerName="container-00"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.440153 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="060ec9d2-6752-4521-8559-153f39829b3e" containerName="container-00"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.440357 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="060ec9d2-6752-4521-8559-153f39829b3e" containerName="container-00"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.440999 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.469974 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgfft\" (UniqueName: \"kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.470059 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.571298 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgfft\" (UniqueName: \"kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.571367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.571586 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.591825 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgfft\" (UniqueName: \"kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft\") pod \"crc-debug-wq6s9\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") " pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.757914 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:41 crc kubenswrapper[4848]: W1206 16:00:41.789942 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5135505_51b1_48ed_9338_6cf5800570d6.slice/crio-20a4603f5c2ce0fb3694b1482dfb89f28c120b6455010ed023cee0cf3a1c6124 WatchSource:0}: Error finding container 20a4603f5c2ce0fb3694b1482dfb89f28c120b6455010ed023cee0cf3a1c6124: Status 404 returned error can't find the container with id 20a4603f5c2ce0fb3694b1482dfb89f28c120b6455010ed023cee0cf3a1c6124
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.871135 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-wq6s9" event={"ID":"b5135505-51b1-48ed-9338-6cf5800570d6","Type":"ContainerStarted","Data":"20a4603f5c2ce0fb3694b1482dfb89f28c120b6455010ed023cee0cf3a1c6124"}
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.872508 4848 scope.go:117] "RemoveContainer" containerID="8b2771178ba29b0479ca964a4235099c06b71645a0dde4dd35bb6f07b3b27354"
Dec 06 16:00:41 crc kubenswrapper[4848]: I1206 16:00:41.872553 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-kj8kb"
Dec 06 16:00:42 crc kubenswrapper[4848]: I1206 16:00:42.883110 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5135505-51b1-48ed-9338-6cf5800570d6" containerID="35eea3dc99cd22896104bd9d4e9b8d075f04a2e801439b1bca90e54a37d8cbf4" exitCode=0
Dec 06 16:00:42 crc kubenswrapper[4848]: I1206 16:00:42.883149 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/crc-debug-wq6s9" event={"ID":"b5135505-51b1-48ed-9338-6cf5800570d6","Type":"ContainerDied","Data":"35eea3dc99cd22896104bd9d4e9b8d075f04a2e801439b1bca90e54a37d8cbf4"}
Dec 06 16:00:42 crc kubenswrapper[4848]: I1206 16:00:42.914774 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-wq6s9"]
Dec 06 16:00:42 crc kubenswrapper[4848]: I1206 16:00:42.922291 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qlx96/crc-debug-wq6s9"]
Dec 06 16:00:42 crc kubenswrapper[4848]: I1206 16:00:42.977728 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060ec9d2-6752-4521-8559-153f39829b3e" path="/var/lib/kubelet/pods/060ec9d2-6752-4521-8559-153f39829b3e/volumes"
Dec 06 16:00:43 crc kubenswrapper[4848]: I1206 16:00:43.989897 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.123519 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host\") pod \"b5135505-51b1-48ed-9338-6cf5800570d6\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") "
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.123626 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host" (OuterVolumeSpecName: "host") pod "b5135505-51b1-48ed-9338-6cf5800570d6" (UID: "b5135505-51b1-48ed-9338-6cf5800570d6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.123713 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgfft\" (UniqueName: \"kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft\") pod \"b5135505-51b1-48ed-9338-6cf5800570d6\" (UID: \"b5135505-51b1-48ed-9338-6cf5800570d6\") "
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.124263 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5135505-51b1-48ed-9338-6cf5800570d6-host\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.129462 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft" (OuterVolumeSpecName: "kube-api-access-tgfft") pod "b5135505-51b1-48ed-9338-6cf5800570d6" (UID: "b5135505-51b1-48ed-9338-6cf5800570d6"). InnerVolumeSpecName "kube-api-access-tgfft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.225746 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgfft\" (UniqueName: \"kubernetes.io/projected/b5135505-51b1-48ed-9338-6cf5800570d6-kube-api-access-tgfft\") on node \"crc\" DevicePath \"\""
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.901601 4848 scope.go:117] "RemoveContainer" containerID="35eea3dc99cd22896104bd9d4e9b8d075f04a2e801439b1bca90e54a37d8cbf4"
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.901648 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/crc-debug-wq6s9"
Dec 06 16:00:44 crc kubenswrapper[4848]: I1206 16:00:44.976210 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5135505-51b1-48ed-9338-6cf5800570d6" path="/var/lib/kubelet/pods/b5135505-51b1-48ed-9338-6cf5800570d6/volumes"
Dec 06 16:00:58 crc kubenswrapper[4848]: I1206 16:00:58.556812 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5668cb4c58-xxwrf_3ae8c267-b7dd-4336-bedb-11c1e1bae7c3/barbican-api/0.log"
Dec 06 16:00:58 crc kubenswrapper[4848]: I1206 16:00:58.760890 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5668cb4c58-xxwrf_3ae8c267-b7dd-4336-bedb-11c1e1bae7c3/barbican-api-log/0.log"
Dec 06 16:00:58 crc kubenswrapper[4848]: I1206 16:00:58.782977 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84d7d7d8f8-jgnhx_318d0309-cf5f-4bfe-8c93-c72f13ce4a24/barbican-keystone-listener/0.log"
Dec 06 16:00:58 crc kubenswrapper[4848]: I1206 16:00:58.950340 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84d7d7d8f8-jgnhx_318d0309-cf5f-4bfe-8c93-c72f13ce4a24/barbican-keystone-listener-log/0.log"
Dec 06 16:00:58 crc kubenswrapper[4848]: I1206 16:00:58.978729 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69c9c94d7-cmv75_e66ac417-22af-4413-afdd-d3b8006a5eb8/barbican-worker/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.083665 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69c9c94d7-cmv75_e66ac417-22af-4413-afdd-d3b8006a5eb8/barbican-worker-log/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.157244 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/ceilometer-central-agent/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.222605 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/ceilometer-notification-agent/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.270980 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/proxy-httpd/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.368958 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/sg-core/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.427269 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_61e60c86-1fae-4b73-9c2c-bb5bdd108630/cinder-api/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.462546 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_61e60c86-1fae-4b73-9c2c-bb5bdd108630/cinder-api-log/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.609281 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d5c9c312-22cc-49cf-b342-247cfd7b1906/probe/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.625738 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d5c9c312-22cc-49cf-b342-247cfd7b1906/cinder-scheduler/0.log"
Dec 06 16:00:59 crc kubenswrapper[4848]: I1206 16:00:59.821867 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/init/0.log"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.035101 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/dnsmasq-dns/0.log"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.112162 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9f14ea76-e339-4e47-9063-898de1d2fac8/glance-httpd/0.log"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.115758 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/init/0.log"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.149213 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29417281-xvp8d"]
Dec 06 16:01:00 crc kubenswrapper[4848]: E1206 16:01:00.149613 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5135505-51b1-48ed-9338-6cf5800570d6" containerName="container-00"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.149630 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5135505-51b1-48ed-9338-6cf5800570d6" containerName="container-00"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.149851 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5135505-51b1-48ed-9338-6cf5800570d6" containerName="container-00"
Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.150499 4848 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.163529 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29417281-xvp8d"] Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.221548 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.221613 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.221740 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwdl\" (UniqueName: \"kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.221765 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.323845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.324241 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.324462 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwdl\" (UniqueName: \"kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.324519 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.330595 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.331507 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.348230 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.349791 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwdl\" (UniqueName: \"kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl\") pod \"keystone-cron-29417281-xvp8d\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.417998 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9f14ea76-e339-4e47-9063-898de1d2fac8/glance-log/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.465987 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_08ab771a-21e5-4145-8954-8ac8c039a8c4/glance-httpd/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.471933 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.587579 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_08ab771a-21e5-4145-8954-8ac8c039a8c4/glance-log/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.687931 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/init/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.869464 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/init/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.904124 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/ironic-api-log/0.log" Dec 06 16:01:00 crc kubenswrapper[4848]: I1206 16:01:00.926728 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/ironic-api/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.029769 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29417281-xvp8d"] Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.046430 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29417281-xvp8d" event={"ID":"dbabea74-70c4-42a2-aa24-2f3976b6b616","Type":"ContainerStarted","Data":"881d817450630e9d55247c42c7a2674bd39789ec49e4aabebf4f376187c03173"} Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.075127 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.286868 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.291460 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.340946 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.592550 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:01:01 crc kubenswrapper[4848]: I1206 16:01:01.639996 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.036349 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.059072 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29417281-xvp8d" event={"ID":"dbabea74-70c4-42a2-aa24-2f3976b6b616","Type":"ContainerStarted","Data":"d4b8ad7ff37c18eaab714e71fd58aa018acb11ed4845d612b7e657b65f713b1e"} Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.077304 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29417281-xvp8d" podStartSLOduration=2.07728881 podStartE2EDuration="2.07728881s" podCreationTimestamp="2025-12-06 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 16:01:02.073171049 +0000 UTC m=+1929.371181962" 
watchObservedRunningTime="2025-12-06 16:01:02.07728881 +0000 UTC m=+1929.375299723" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.328233 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.482106 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.566879 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/httpboot/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.744844 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ramdisk-logs/0.log" Dec 06 16:01:02 crc kubenswrapper[4848]: I1206 16:01:02.803922 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-conductor/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.013500 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.045459 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.048465 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8k5vr"] Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.049651 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8k5vr"] Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.206629 4848 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.233101 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/ironic-db-sync/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.256503 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.417007 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.553796 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.597889 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.641143 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.683337 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.801726 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.807418 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-httpboot/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.847344 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.861442 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-inspector/0.log" Dec 06 16:01:03 crc kubenswrapper[4848]: I1206 16:01:03.904435 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-inspector-httpd/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.015072 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ramdisk-logs/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.018708 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-sync-6xvjg_fb815e0a-f0ff-40d4-b1c4-1220b71db056/ironic-inspector-db-sync/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.074848 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbabea74-70c4-42a2-aa24-2f3976b6b616" containerID="d4b8ad7ff37c18eaab714e71fd58aa018acb11ed4845d612b7e657b65f713b1e" exitCode=0 Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.074892 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29417281-xvp8d" event={"ID":"dbabea74-70c4-42a2-aa24-2f3976b6b616","Type":"ContainerDied","Data":"d4b8ad7ff37c18eaab714e71fd58aa018acb11ed4845d612b7e657b65f713b1e"} Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.088393 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-neutron-agent-5f6db98496-rh44f_692f44d3-ff17-419f-b16c-b37f71521603/ironic-neutron-agent/2.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.177725 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-5f6db98496-rh44f_692f44d3-ff17-419f-b16c-b37f71521603/ironic-neutron-agent/1.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.336642 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e9aa43e5-ee37-49dc-8278-f2018f524c42/kube-state-metrics/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.383730 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84fbb4c9b8-bdccl_63d987d2-da9e-4cfb-b409-2b5c66f307f8/keystone-api/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.608094 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7d47d5c9-wf778_aead053e-0f4a-48bf-b446-9a1dbdc7e996/neutron-httpd/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.687814 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7d47d5c9-wf778_aead053e-0f4a-48bf-b446-9a1dbdc7e996/neutron-api/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.918659 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_310ce79c-5eaa-461b-b99c-9e4aee7849c4/nova-api-log/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.953994 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_310ce79c-5eaa-461b-b99c-9e4aee7849c4/nova-api-api/0.log" Dec 06 16:01:04 crc kubenswrapper[4848]: I1206 16:01:04.978726 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf8dae4-61fd-4899-a2ea-07e6277f8c3f" path="/var/lib/kubelet/pods/aaf8dae4-61fd-4899-a2ea-07e6277f8c3f/volumes" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.256195 4848 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-n4hw7_c999bbb5-2904-48f5-bfa0-48a0ce1692d7/nova-manage/0.log" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.293759 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ea901ebd-9d73-4ba0-8448-634e2b3d17f7/nova-cell0-conductor-conductor/0.log" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.476067 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.528336 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data\") pod \"dbabea74-70c4-42a2-aa24-2f3976b6b616\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.528514 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdwdl\" (UniqueName: \"kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl\") pod \"dbabea74-70c4-42a2-aa24-2f3976b6b616\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.528612 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle\") pod \"dbabea74-70c4-42a2-aa24-2f3976b6b616\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.528638 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys\") pod \"dbabea74-70c4-42a2-aa24-2f3976b6b616\" (UID: \"dbabea74-70c4-42a2-aa24-2f3976b6b616\") " Dec 06 
16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.533798 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dbabea74-70c4-42a2-aa24-2f3976b6b616" (UID: "dbabea74-70c4-42a2-aa24-2f3976b6b616"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.534513 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl" (OuterVolumeSpecName: "kube-api-access-qdwdl") pod "dbabea74-70c4-42a2-aa24-2f3976b6b616" (UID: "dbabea74-70c4-42a2-aa24-2f3976b6b616"). InnerVolumeSpecName "kube-api-access-qdwdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.558893 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbabea74-70c4-42a2-aa24-2f3976b6b616" (UID: "dbabea74-70c4-42a2-aa24-2f3976b6b616"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.561745 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_40340eae-e441-4326-b678-265f2cd36d20/nova-cell1-conductor-conductor/0.log" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.564481 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-8hbwm_d719bb86-9c8a-47a5-9b01-010f0ac07dac/nova-cell1-conductor-db-sync/0.log" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.606684 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data" (OuterVolumeSpecName: "config-data") pod "dbabea74-70c4-42a2-aa24-2f3976b6b616" (UID: "dbabea74-70c4-42a2-aa24-2f3976b6b616"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.631231 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.631259 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.631269 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabea74-70c4-42a2-aa24-2f3976b6b616-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.631279 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdwdl\" (UniqueName: \"kubernetes.io/projected/dbabea74-70c4-42a2-aa24-2f3976b6b616-kube-api-access-qdwdl\") 
on node \"crc\" DevicePath \"\"" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.801932 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_789dd8b9-2530-46a7-b5ee-7276afa689fb/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 16:01:05 crc kubenswrapper[4848]: I1206 16:01:05.873949 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebc9f0-bb2e-4deb-aa48-00553c450e81/nova-metadata-log/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.036936 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hbwm"] Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.049268 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hbwm"] Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.093413 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29417281-xvp8d" event={"ID":"dbabea74-70c4-42a2-aa24-2f3976b6b616","Type":"ContainerDied","Data":"881d817450630e9d55247c42c7a2674bd39789ec49e4aabebf4f376187c03173"} Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.093795 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="881d817450630e9d55247c42c7a2674bd39789ec49e4aabebf4f376187c03173" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.093453 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29417281-xvp8d" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.205261 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebc9f0-bb2e-4deb-aa48-00553c450e81/nova-metadata-metadata/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.212066 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4d6e9497-4228-4848-962e-e319a0c0fdf4/nova-scheduler-scheduler/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.304265 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/mysql-bootstrap/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.437253 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/mysql-bootstrap/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.475203 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/galera/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.571835 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/mysql-bootstrap/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.759915 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/galera/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.785100 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/mysql-bootstrap/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.839545 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_28b28ed8-c6be-4256-8ccd-8c560959048b/openstackclient/0.log" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.978591 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d719bb86-9c8a-47a5-9b01-010f0ac07dac" path="/var/lib/kubelet/pods/d719bb86-9c8a-47a5-9b01-010f0ac07dac/volumes" Dec 06 16:01:06 crc kubenswrapper[4848]: I1206 16:01:06.978912 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-g6wzf_93c0a1e4-91cd-4801-8439-a41fb872135f/ovn-controller/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.078330 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fl4pk_28348623-0697-417e-8f17-de443d77348c/openstack-network-exporter/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.215987 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server-init/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.394162 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.395294 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server-init/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.418210 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovs-vswitchd/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.563568 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_04a22b86-df7e-4426-aa1c-3f8c21c02354/openstack-network-exporter/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.576125 4848 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_04a22b86-df7e-4426-aa1c-3f8c21c02354/ovn-northd/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.666762 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2928825b-3e1c-48cd-827e-afad27fe84c1/openstack-network-exporter/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.785513 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2928825b-3e1c-48cd-827e-afad27fe84c1/ovsdbserver-nb/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.889775 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d723dc9-fd9d-4b78-9fa6-c18e8656f634/openstack-network-exporter/0.log" Dec 06 16:01:07 crc kubenswrapper[4848]: I1206 16:01:07.891112 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d723dc9-fd9d-4b78-9fa6-c18e8656f634/ovsdbserver-sb/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.053416 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7b97f6b4-d9lkl_96280da8-11f8-49be-81a6-3bdcd053463f/placement-api/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.159127 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7b97f6b4-d9lkl_96280da8-11f8-49be-81a6-3bdcd053463f/placement-log/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.298933 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/setup-container/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.454424 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/setup-container/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.501213 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/setup-container/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.518208 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/rabbitmq/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.697878 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/setup-container/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.821517 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/rabbitmq/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.827622 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76858ffddc-pvnks_6c86a3f4-dccd-48e8-9169-a63eaaded209/proxy-httpd/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.961604 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76858ffddc-pvnks_6c86a3f4-dccd-48e8-9169-a63eaaded209/proxy-server/0.log" Dec 06 16:01:08 crc kubenswrapper[4848]: I1206 16:01:08.996069 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-99ss6_6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d/swift-ring-rebalance/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.153488 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-auditor/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.206586 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-reaper/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.243720 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-replicator/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.301189 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-server/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.376409 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-auditor/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.421259 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-replicator/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.460765 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-server/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.561262 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-auditor/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.590096 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-updater/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.643884 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-expirer/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.704086 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-replicator/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.785111 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-server/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.813489 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-updater/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.908453 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/rsync/0.log" Dec 06 16:01:09 crc kubenswrapper[4848]: I1206 16:01:09.952465 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/swift-recon-cron/0.log" Dec 06 16:01:12 crc kubenswrapper[4848]: I1206 16:01:12.991612 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d62dd990-abf6-47e3-aafd-5e7efb0ab5c6/memcached/0.log" Dec 06 16:01:17 crc kubenswrapper[4848]: I1206 16:01:17.150607 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:01:17 crc kubenswrapper[4848]: I1206 16:01:17.151043 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:01:23 crc kubenswrapper[4848]: I1206 16:01:23.243640 4848 scope.go:117] "RemoveContainer" containerID="0b142f17068d8fdee513880326d48298aecb4fa3b2b3502488efe1e6f6753bf1" Dec 06 16:01:23 crc kubenswrapper[4848]: I1206 16:01:23.285154 4848 scope.go:117] "RemoveContainer" 
containerID="e37b77da47c99c43de099f8dcc3f0726f7596dbe4eeb8ca1761bc8c082d88f49" Dec 06 16:01:23 crc kubenswrapper[4848]: I1206 16:01:23.330922 4848 scope.go:117] "RemoveContainer" containerID="f6e1852ba42c8d6702ad8fb8a52d9026c79f96bf5ac698a3ad19be532e4185de" Dec 06 16:01:30 crc kubenswrapper[4848]: I1206 16:01:30.995604 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.060865 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.079745 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.141676 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.360283 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.373888 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.374082 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/extract/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.526834 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-69cx9_706ced85-8889-45e9-bd15-1a2747a9de2e/kube-rbac-proxy/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.587910 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-plnw5_9418eec1-6430-4bb9-a7be-6ec83f61c629/kube-rbac-proxy/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.608946 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-69cx9_706ced85-8889-45e9-bd15-1a2747a9de2e/manager/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.726162 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-plnw5_9418eec1-6430-4bb9-a7be-6ec83f61c629/manager/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.788812 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-rhsqm_1438e750-61e2-4c37-8d03-f22b8ebad123/kube-rbac-proxy/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.810174 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-rhsqm_1438e750-61e2-4c37-8d03-f22b8ebad123/manager/0.log" Dec 06 16:01:31 crc kubenswrapper[4848]: I1206 16:01:31.941049 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-dzsgv_d99ce981-7f71-4636-94c7-c848830429f3/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 
16:01:32.051119 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-dzsgv_d99ce981-7f71-4636-94c7-c848830429f3/manager/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.128129 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6vq9l_fd64b532-b259-49e5-bd47-62d9d20b6a69/manager/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.180865 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6vq9l_fd64b532-b259-49e5-bd47-62d9d20b6a69/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.255407 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jtn8k_1b49aafe-c450-441c-ade7-0d87b868dc2a/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.330311 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jtn8k_1b49aafe-c450-441c-ade7-0d87b868dc2a/manager/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.400031 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-4k55x_cf4f4d25-7fc2-411f-9e23-71171162f38a/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.637589 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54476ccddc-74npj_eecbdda5-b888-4fb9-979d-66bb4d8ffcf4/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.651033 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54476ccddc-74npj_eecbdda5-b888-4fb9-979d-66bb4d8ffcf4/manager/0.log" Dec 06 
16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.706447 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-4k55x_cf4f4d25-7fc2-411f-9e23-71171162f38a/manager/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.817354 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4gqwc_fee1c62f-115d-472d-8617-32a386cf06c2/kube-rbac-proxy/0.log" Dec 06 16:01:32 crc kubenswrapper[4848]: I1206 16:01:32.907250 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4gqwc_fee1c62f-115d-472d-8617-32a386cf06c2/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.018105 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-4vwqn_76bfd2a6-6774-4992-91ec-c73327b11bd8/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.024731 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-4vwqn_76bfd2a6-6774-4992-91ec-c73327b11bd8/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.122843 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-9vx7t_95e14baa-5bbb-4bf6-9420-0428c25cc98f/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.217496 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-9vx7t_95e14baa-5bbb-4bf6-9420-0428c25cc98f/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.430488 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vpn6w_7c78ca28-a1c2-45b9-9a14-733aae9ee555/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.512745 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vpn6w_7c78ca28-a1c2-45b9-9a14-733aae9ee555/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.557685 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-s6hbc_fa68cc1e-18ec-42d1-a3de-948ef2cc0804/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.732004 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-s6hbc_fa68cc1e-18ec-42d1-a3de-948ef2cc0804/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.734576 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9zcn8_0d42fecd-1b6d-4f29-816c-38515c0a547c/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.748962 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9zcn8_0d42fecd-1b6d-4f29-816c-38515c0a547c/manager/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.918551 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fgtgmh_ec6dd72c-05cb-49f7-af2a-01a76807175c/kube-rbac-proxy/0.log" Dec 06 16:01:33 crc kubenswrapper[4848]: I1206 16:01:33.918948 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fgtgmh_ec6dd72c-05cb-49f7-af2a-01a76807175c/manager/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: 
I1206 16:01:34.306567 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-865d7c46f-s58q4_e10571bc-0d25-47b1-bcd1-272c89db1fb6/operator/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: I1206 16:01:34.327001 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-62kg6_10365f81-1470-441b-9533-24f6a526fe55/registry-server/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: I1206 16:01:34.540109 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-x7d2b_dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd/kube-rbac-proxy/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: I1206 16:01:34.591356 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-x7d2b_dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd/manager/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: I1206 16:01:34.722644 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bxhmf_90274bce-5d78-4f62-8265-2218bc58916d/kube-rbac-proxy/0.log" Dec 06 16:01:34 crc kubenswrapper[4848]: I1206 16:01:34.800073 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bxhmf_90274bce-5d78-4f62-8265-2218bc58916d/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.006942 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pzg55_deacbe3a-30ba-42bb-a180-f8e2360ba937/operator/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.011019 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-5h8p9_d2b8bc9b-a359-4fe9-a039-200eed4f7218/kube-rbac-proxy/0.log" Dec 06 16:01:35 crc 
kubenswrapper[4848]: I1206 16:01:35.033123 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fcf4cdbd6-q2rkz_bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.076127 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-5h8p9_d2b8bc9b-a359-4fe9-a039-200eed4f7218/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.220118 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cwth7_cf3406a1-eb72-4251-bbe3-45a33235ac96/kube-rbac-proxy/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.269861 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cwth7_cf3406a1-eb72-4251-bbe3-45a33235ac96/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.285910 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-x7tzd_f59ce483-133a-495a-862c-3676661adab8/kube-rbac-proxy/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.401392 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-x7tzd_f59ce483-133a-495a-862c-3676661adab8/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.418547 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-s6ttq_93dc2914-4eae-4bd4-a4c3-94122e44f908/manager/0.log" Dec 06 16:01:35 crc kubenswrapper[4848]: I1206 16:01:35.431683 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-s6ttq_93dc2914-4eae-4bd4-a4c3-94122e44f908/kube-rbac-proxy/0.log" Dec 06 16:01:47 crc kubenswrapper[4848]: I1206 16:01:47.040873 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-n4hw7"] Dec 06 16:01:47 crc kubenswrapper[4848]: I1206 16:01:47.048034 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-n4hw7"] Dec 06 16:01:47 crc kubenswrapper[4848]: I1206 16:01:47.150467 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:01:47 crc kubenswrapper[4848]: I1206 16:01:47.150557 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:01:48 crc kubenswrapper[4848]: I1206 16:01:48.977780 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c999bbb5-2904-48f5-bfa0-48a0ce1692d7" path="/var/lib/kubelet/pods/c999bbb5-2904-48f5-bfa0-48a0ce1692d7/volumes" Dec 06 16:01:52 crc kubenswrapper[4848]: I1206 16:01:52.042273 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6wjr7_204ca83b-b95a-451a-bc43-a46bf0f2859d/control-plane-machine-set-operator/0.log" Dec 06 16:01:52 crc kubenswrapper[4848]: I1206 16:01:52.151779 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7h7ps_4a6e4efa-abe1-44da-9528-73fe113e016a/kube-rbac-proxy/0.log" Dec 06 16:01:52 crc kubenswrapper[4848]: I1206 16:01:52.198807 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7h7ps_4a6e4efa-abe1-44da-9528-73fe113e016a/machine-api-operator/0.log" Dec 06 16:02:02 crc kubenswrapper[4848]: I1206 16:02:02.883881 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-q9bg5_be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960/cert-manager-controller/0.log" Dec 06 16:02:03 crc kubenswrapper[4848]: I1206 16:02:03.025307 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zlwlz_96eace79-3285-4aef-902d-aed98f97663c/cert-manager-cainjector/0.log" Dec 06 16:02:03 crc kubenswrapper[4848]: I1206 16:02:03.067238 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pwsfx_bd79a30b-a387-4c71-9415-5d8c8a20cd63/cert-manager-webhook/0.log" Dec 06 16:02:13 crc kubenswrapper[4848]: I1206 16:02:13.744844 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-f5s7t_ee3d85c7-86b8-4eb5-9832-65313a05b1a4/nmstate-console-plugin/0.log" Dec 06 16:02:13 crc kubenswrapper[4848]: I1206 16:02:13.902540 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nfpfl_b6ab59ca-8a07-452a-bc3a-b071fccdc3ff/nmstate-metrics/0.log" Dec 06 16:02:13 crc kubenswrapper[4848]: I1206 16:02:13.913819 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nfpfl_b6ab59ca-8a07-452a-bc3a-b071fccdc3ff/kube-rbac-proxy/0.log" Dec 06 16:02:13 crc kubenswrapper[4848]: I1206 16:02:13.920519 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-j5wcn_49d9817d-6554-407c-9842-c044929eb803/nmstate-handler/0.log" Dec 06 16:02:14 crc kubenswrapper[4848]: I1206 16:02:14.074017 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5m7pn_3ad4635f-e66a-4dee-a97b-e2b94ae72319/nmstate-operator/0.log" Dec 06 16:02:14 crc kubenswrapper[4848]: I1206 16:02:14.122122 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8hj7l_54fb82cb-0b0a-42e8-99dc-3df8b471bd89/nmstate-webhook/0.log" Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.150245 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.150836 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.150884 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.151638 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 
16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.151688 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d" gracePeriod=600 Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.734390 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d" exitCode=0 Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.734481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d"} Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.734731 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5"} Dec 06 16:02:17 crc kubenswrapper[4848]: I1206 16:02:17.734753 4848 scope.go:117] "RemoveContainer" containerID="2ae3cad303aea5a3230e3abfc5c03536f67440f3aa3e5ac94b1db6e3252ee3d8" Dec 06 16:02:23 crc kubenswrapper[4848]: I1206 16:02:23.470877 4848 scope.go:117] "RemoveContainer" containerID="ff268bc4d2434693e8ae701076e75a12788362c1cd5f3b4d79696327851e1617" Dec 06 16:02:26 crc kubenswrapper[4848]: I1206 16:02:26.869920 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9xn7s_53fefa8d-ef28-4a20-8e75-b633d64f4863/kube-rbac-proxy/0.log" Dec 06 16:02:26 crc kubenswrapper[4848]: I1206 16:02:26.951374 4848 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9xn7s_53fefa8d-ef28-4a20-8e75-b633d64f4863/controller/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.127539 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.323460 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.325030 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.347663 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.347827 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.502499 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.535297 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.561014 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.562624 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.577581 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:27 crc kubenswrapper[4848]: E1206 16:02:27.577971 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbabea74-70c4-42a2-aa24-2f3976b6b616" containerName="keystone-cron" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.577986 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbabea74-70c4-42a2-aa24-2f3976b6b616" containerName="keystone-cron" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.578192 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbabea74-70c4-42a2-aa24-2f3976b6b616" containerName="keystone-cron" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.580372 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.592683 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.735472 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.735582 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d98p\" (UniqueName: \"kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p\") pod \"community-operators-zc7cx\" (UID: 
\"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.735650 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.750180 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.769592 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.823916 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/controller/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.836991 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.837375 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d98p\" (UniqueName: \"kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.837529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.837776 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.838064 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.838184 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.859808 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d98p\" (UniqueName: \"kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p\") pod \"community-operators-zc7cx\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:27 crc kubenswrapper[4848]: I1206 16:02:27.903439 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.287341 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/kube-rbac-proxy/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.296352 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/frr-metrics/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.435471 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/kube-rbac-proxy-frr/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.531647 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/reloader/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.569151 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.719412 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-ptrrq_99f1be97-6216-4892-8da4-32dac60daaaa/frr-k8s-webhook-server/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.762678 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-d8c5d4748-tk9lq_b31b58c9-49c3-4e31-b838-5532169c319b/manager/0.log" Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.827796 4848 generic.go:334] "Generic (PLEG): container finished" podID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerID="b41c2aa211d7b9d502623bca8238dde96bda5501c52d3ee68c1a4f6677d09e76" exitCode=0 Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.827839 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerDied","Data":"b41c2aa211d7b9d502623bca8238dde96bda5501c52d3ee68c1a4f6677d09e76"} Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.827866 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerStarted","Data":"781d811434163732858638119a7cc1f6b517e333c5738846428277385208cc00"} Dec 06 16:02:28 crc kubenswrapper[4848]: I1206 16:02:28.936829 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57cbc9df7c-kltd9_d03f9739-5d90-45b0-a717-bc66f0522234/webhook-server/0.log" Dec 06 16:02:29 crc kubenswrapper[4848]: I1206 16:02:29.170416 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4b5lc_1649535c-6c66-412c-be24-f452edfe82a1/kube-rbac-proxy/0.log" Dec 06 16:02:29 crc kubenswrapper[4848]: I1206 16:02:29.172167 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/frr/0.log" Dec 06 16:02:29 crc kubenswrapper[4848]: I1206 16:02:29.461039 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4b5lc_1649535c-6c66-412c-be24-f452edfe82a1/speaker/0.log" Dec 06 16:02:29 crc kubenswrapper[4848]: I1206 16:02:29.837884 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerStarted","Data":"c1f9f27aa9959b08cd70177f9d7db2cc51df1b7a9c4bc862da9a982350975c86"} Dec 06 16:02:30 crc kubenswrapper[4848]: I1206 16:02:30.849385 4848 generic.go:334] "Generic (PLEG): container finished" podID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerID="c1f9f27aa9959b08cd70177f9d7db2cc51df1b7a9c4bc862da9a982350975c86" exitCode=0 Dec 06 
16:02:30 crc kubenswrapper[4848]: I1206 16:02:30.849446 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerDied","Data":"c1f9f27aa9959b08cd70177f9d7db2cc51df1b7a9c4bc862da9a982350975c86"} Dec 06 16:02:31 crc kubenswrapper[4848]: I1206 16:02:31.859447 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerStarted","Data":"3a369659f297c88d94fcfd4107c6d43b37d706cfce25cd53bb401a04f9c24bdf"} Dec 06 16:02:37 crc kubenswrapper[4848]: I1206 16:02:37.904289 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:37 crc kubenswrapper[4848]: I1206 16:02:37.904837 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:37 crc kubenswrapper[4848]: I1206 16:02:37.959376 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:37 crc kubenswrapper[4848]: I1206 16:02:37.977921 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zc7cx" podStartSLOduration=8.259686887 podStartE2EDuration="10.97790567s" podCreationTimestamp="2025-12-06 16:02:27 +0000 UTC" firstStartedPulling="2025-12-06 16:02:28.829505428 +0000 UTC m=+2016.127516341" lastFinishedPulling="2025-12-06 16:02:31.547724211 +0000 UTC m=+2018.845735124" observedRunningTime="2025-12-06 16:02:31.875905353 +0000 UTC m=+2019.173916276" watchObservedRunningTime="2025-12-06 16:02:37.97790567 +0000 UTC m=+2025.275916583" Dec 06 16:02:38 crc kubenswrapper[4848]: I1206 16:02:38.958478 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:40 crc kubenswrapper[4848]: I1206 16:02:40.375179 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:40 crc kubenswrapper[4848]: I1206 16:02:40.939123 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zc7cx" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="registry-server" containerID="cri-o://3a369659f297c88d94fcfd4107c6d43b37d706cfce25cd53bb401a04f9c24bdf" gracePeriod=2 Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.049013 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.050342 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.049297 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.050400 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.357310 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/extract/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.505116 4848 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.537032 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.918194 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:02:42 crc kubenswrapper[4848]: I1206 16:02:42.919921 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.141044 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.141117 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.368260 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.453298 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.469877 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/extract/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.615278 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.793374 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.797323 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.799570 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.967247 4848 generic.go:334] "Generic (PLEG): container finished" podID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerID="3a369659f297c88d94fcfd4107c6d43b37d706cfce25cd53bb401a04f9c24bdf" exitCode=0 Dec 06 16:02:43 crc kubenswrapper[4848]: I1206 16:02:43.967597 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerDied","Data":"3a369659f297c88d94fcfd4107c6d43b37d706cfce25cd53bb401a04f9c24bdf"} Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 
16:02:44.000869 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.036313 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.107845 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.174710 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities\") pod \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.175855 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d98p\" (UniqueName: \"kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p\") pod \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.176152 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content\") pod \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\" (UID: \"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d\") " Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.184675 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities" (OuterVolumeSpecName: "utilities") pod "d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" (UID: 
"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.230079 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p" (OuterVolumeSpecName: "kube-api-access-4d98p") pod "d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" (UID: "d38273e1-d7b4-4be7-af9e-eaec81e3bc9d"). InnerVolumeSpecName "kube-api-access-4d98p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.246971 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" (UID: "d38273e1-d7b4-4be7-af9e-eaec81e3bc9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.294663 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.302817 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d98p\" (UniqueName: \"kubernetes.io/projected/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-kube-api-access-4d98p\") on node \"crc\" DevicePath \"\"" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.302854 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.302872 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.438602 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/registry-server/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.453430 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.469750 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.538683 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.668092 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.687529 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.886025 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zc7cx_d38273e1-d7b4-4be7-af9e-eaec81e3bc9d/extract-utilities/0.log" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.979779 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zc7cx" Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.982388 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zc7cx" event={"ID":"d38273e1-d7b4-4be7-af9e-eaec81e3bc9d","Type":"ContainerDied","Data":"781d811434163732858638119a7cc1f6b517e333c5738846428277385208cc00"} Dec 06 16:02:44 crc kubenswrapper[4848]: I1206 16:02:44.982440 4848 scope.go:117] "RemoveContainer" containerID="3a369659f297c88d94fcfd4107c6d43b37d706cfce25cd53bb401a04f9c24bdf" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.022536 4848 scope.go:117] "RemoveContainer" containerID="c1f9f27aa9959b08cd70177f9d7db2cc51df1b7a9c4bc862da9a982350975c86" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.026600 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/registry-server/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.027946 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.041746 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zc7cx"] Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.053904 4848 scope.go:117] "RemoveContainer" containerID="b41c2aa211d7b9d502623bca8238dde96bda5501c52d3ee68c1a4f6677d09e76" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.356956 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zgk8b_6c0af646-ef02-4709-9e31-cd29cd07fa4a/marketplace-operator/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.367977 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.587935 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.588174 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.599734 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.791407 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.841309 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.856224 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:02:45 crc kubenswrapper[4848]: I1206 16:02:45.868376 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/registry-server/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.043022 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.056583 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.057115 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.212776 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.217270 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.548657 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/registry-server/0.log" Dec 06 16:02:46 crc kubenswrapper[4848]: I1206 16:02:46.978716 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" path="/var/lib/kubelet/pods/d38273e1-d7b4-4be7-af9e-eaec81e3bc9d/volumes" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.300855 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:10 crc kubenswrapper[4848]: E1206 16:04:10.301852 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="extract-content" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.301872 4848 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="extract-content" Dec 06 16:04:10 crc kubenswrapper[4848]: E1206 16:04:10.301887 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="registry-server" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.301892 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="registry-server" Dec 06 16:04:10 crc kubenswrapper[4848]: E1206 16:04:10.301915 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="extract-utilities" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.301921 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="extract-utilities" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.302113 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38273e1-d7b4-4be7-af9e-eaec81e3bc9d" containerName="registry-server" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.303622 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.315142 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.367872 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.368011 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4p4\" (UniqueName: \"kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.368117 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.469597 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.469732 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.469905 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4p4\" (UniqueName: \"kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.470298 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.470596 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.497874 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4p4\" (UniqueName: \"kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4\") pod \"certified-operators-kl4qp\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:10 crc kubenswrapper[4848]: I1206 16:04:10.639172 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:11 crc kubenswrapper[4848]: I1206 16:04:11.374725 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:12 crc kubenswrapper[4848]: I1206 16:04:12.144935 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerID="d8d1d3e3041b2363716f85aa40501ea4e572b60e206b643a3654ebbf5f71fdcd" exitCode=0 Dec 06 16:04:12 crc kubenswrapper[4848]: I1206 16:04:12.144980 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerDied","Data":"d8d1d3e3041b2363716f85aa40501ea4e572b60e206b643a3654ebbf5f71fdcd"} Dec 06 16:04:12 crc kubenswrapper[4848]: I1206 16:04:12.145156 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerStarted","Data":"de09f56c902d81da0b7b542c58a4ef06075c22ec0636012ba54134ea3d9637c2"} Dec 06 16:04:12 crc kubenswrapper[4848]: I1206 16:04:12.146688 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 16:04:13 crc kubenswrapper[4848]: I1206 16:04:13.155199 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerStarted","Data":"418ce74fc58b6e79f262a79a50e2e8f477da469c3cc3379cf707eb090d7f3ceb"} Dec 06 16:04:14 crc kubenswrapper[4848]: I1206 16:04:14.164582 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerID="418ce74fc58b6e79f262a79a50e2e8f477da469c3cc3379cf707eb090d7f3ceb" exitCode=0 Dec 06 16:04:14 crc kubenswrapper[4848]: I1206 16:04:14.164931 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-kl4qp" event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerDied","Data":"418ce74fc58b6e79f262a79a50e2e8f477da469c3cc3379cf707eb090d7f3ceb"} Dec 06 16:04:15 crc kubenswrapper[4848]: I1206 16:04:15.179410 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerStarted","Data":"2bed8fccd80ce8ceb370b972a5252a2b147f0faae0e6788fbb3298d18ea2b389"} Dec 06 16:04:15 crc kubenswrapper[4848]: I1206 16:04:15.202856 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kl4qp" podStartSLOduration=2.801625371 podStartE2EDuration="5.202836366s" podCreationTimestamp="2025-12-06 16:04:10 +0000 UTC" firstStartedPulling="2025-12-06 16:04:12.146460411 +0000 UTC m=+2119.444471324" lastFinishedPulling="2025-12-06 16:04:14.547671406 +0000 UTC m=+2121.845682319" observedRunningTime="2025-12-06 16:04:15.199638309 +0000 UTC m=+2122.497649222" watchObservedRunningTime="2025-12-06 16:04:15.202836366 +0000 UTC m=+2122.500847279" Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.150231 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.150557 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.734592 4848 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.739797 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.745424 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.918274 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.918332 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9c8\" (UniqueName: \"kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:17 crc kubenswrapper[4848]: I1206 16:04:17.918440 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.020353 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9c8\" (UniqueName: \"kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8\") pod 
\"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.020494 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.020657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.021219 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.021934 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities\") pod \"redhat-marketplace-qzchp\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.051122 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9c8\" (UniqueName: \"kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8\") pod \"redhat-marketplace-qzchp\" (UID: 
\"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.070794 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:18 crc kubenswrapper[4848]: I1206 16:04:18.505083 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:18 crc kubenswrapper[4848]: W1206 16:04:18.507577 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ae1027_67d0_4ae2_be4c_fbfcb5f243e0.slice/crio-c3c124b2b6d20a9cd559fd8e7a94dfa3b0f23d01c80295f8f103d8ba4d6a1939 WatchSource:0}: Error finding container c3c124b2b6d20a9cd559fd8e7a94dfa3b0f23d01c80295f8f103d8ba4d6a1939: Status 404 returned error can't find the container with id c3c124b2b6d20a9cd559fd8e7a94dfa3b0f23d01c80295f8f103d8ba4d6a1939 Dec 06 16:04:19 crc kubenswrapper[4848]: I1206 16:04:19.215730 4848 generic.go:334] "Generic (PLEG): container finished" podID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerID="b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574" exitCode=0 Dec 06 16:04:19 crc kubenswrapper[4848]: I1206 16:04:19.215783 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerDied","Data":"b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574"} Dec 06 16:04:19 crc kubenswrapper[4848]: I1206 16:04:19.216066 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerStarted","Data":"c3c124b2b6d20a9cd559fd8e7a94dfa3b0f23d01c80295f8f103d8ba4d6a1939"} Dec 06 16:04:20 crc kubenswrapper[4848]: I1206 16:04:20.225483 4848 generic.go:334] "Generic (PLEG): 
container finished" podID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerID="f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253" exitCode=0 Dec 06 16:04:20 crc kubenswrapper[4848]: I1206 16:04:20.225530 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerDied","Data":"f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253"} Dec 06 16:04:20 crc kubenswrapper[4848]: I1206 16:04:20.640149 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:20 crc kubenswrapper[4848]: I1206 16:04:20.640460 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:20 crc kubenswrapper[4848]: I1206 16:04:20.686724 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:21 crc kubenswrapper[4848]: I1206 16:04:21.235994 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerStarted","Data":"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f"} Dec 06 16:04:21 crc kubenswrapper[4848]: I1206 16:04:21.258453 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzchp" podStartSLOduration=2.832034003 podStartE2EDuration="4.258435938s" podCreationTimestamp="2025-12-06 16:04:17 +0000 UTC" firstStartedPulling="2025-12-06 16:04:19.21774457 +0000 UTC m=+2126.515755483" lastFinishedPulling="2025-12-06 16:04:20.644146505 +0000 UTC m=+2127.942157418" observedRunningTime="2025-12-06 16:04:21.25300088 +0000 UTC m=+2128.551011793" watchObservedRunningTime="2025-12-06 16:04:21.258435938 +0000 UTC m=+2128.556446851" 
Dec 06 16:04:21 crc kubenswrapper[4848]: I1206 16:04:21.295638 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:23 crc kubenswrapper[4848]: I1206 16:04:23.091123 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:23 crc kubenswrapper[4848]: I1206 16:04:23.251120 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kl4qp" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="registry-server" containerID="cri-o://2bed8fccd80ce8ceb370b972a5252a2b147f0faae0e6788fbb3298d18ea2b389" gracePeriod=2 Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.271166 4848 generic.go:334] "Generic (PLEG): container finished" podID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerID="ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60" exitCode=0 Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.271248 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qlx96/must-gather-7jnlk" event={"ID":"71bd67b7-42c5-4384-8431-07f05f3ae0a1","Type":"ContainerDied","Data":"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60"} Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.272375 4848 scope.go:117] "RemoveContainer" containerID="ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.275037 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerID="2bed8fccd80ce8ceb370b972a5252a2b147f0faae0e6788fbb3298d18ea2b389" exitCode=0 Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.275077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" 
event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerDied","Data":"2bed8fccd80ce8ceb370b972a5252a2b147f0faae0e6788fbb3298d18ea2b389"} Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.619404 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qlx96_must-gather-7jnlk_71bd67b7-42c5-4384-8431-07f05f3ae0a1/gather/0.log" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.675861 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.812891 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content\") pod \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.813107 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4p4\" (UniqueName: \"kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4\") pod \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.813262 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities\") pod \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\" (UID: \"7aceac04-0c82-4673-ad9d-5dd0cc04b99e\") " Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.814209 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities" (OuterVolumeSpecName: "utilities") pod "7aceac04-0c82-4673-ad9d-5dd0cc04b99e" (UID: "7aceac04-0c82-4673-ad9d-5dd0cc04b99e"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.818058 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4" (OuterVolumeSpecName: "kube-api-access-td4p4") pod "7aceac04-0c82-4673-ad9d-5dd0cc04b99e" (UID: "7aceac04-0c82-4673-ad9d-5dd0cc04b99e"). InnerVolumeSpecName "kube-api-access-td4p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.865115 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aceac04-0c82-4673-ad9d-5dd0cc04b99e" (UID: "7aceac04-0c82-4673-ad9d-5dd0cc04b99e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.915582 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.916006 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4p4\" (UniqueName: \"kubernetes.io/projected/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-kube-api-access-td4p4\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:25 crc kubenswrapper[4848]: I1206 16:04:25.916021 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aceac04-0c82-4673-ad9d-5dd0cc04b99e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.287576 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kl4qp" 
event={"ID":"7aceac04-0c82-4673-ad9d-5dd0cc04b99e","Type":"ContainerDied","Data":"de09f56c902d81da0b7b542c58a4ef06075c22ec0636012ba54134ea3d9637c2"} Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.287642 4848 scope.go:117] "RemoveContainer" containerID="2bed8fccd80ce8ceb370b972a5252a2b147f0faae0e6788fbb3298d18ea2b389" Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.287656 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kl4qp" Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.316997 4848 scope.go:117] "RemoveContainer" containerID="418ce74fc58b6e79f262a79a50e2e8f477da469c3cc3379cf707eb090d7f3ceb" Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.331762 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.338181 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kl4qp"] Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.355876 4848 scope.go:117] "RemoveContainer" containerID="d8d1d3e3041b2363716f85aa40501ea4e572b60e206b643a3654ebbf5f71fdcd" Dec 06 16:04:26 crc kubenswrapper[4848]: I1206 16:04:26.977995 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" path="/var/lib/kubelet/pods/7aceac04-0c82-4673-ad9d-5dd0cc04b99e/volumes" Dec 06 16:04:28 crc kubenswrapper[4848]: I1206 16:04:28.071434 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:28 crc kubenswrapper[4848]: I1206 16:04:28.071965 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:28 crc kubenswrapper[4848]: I1206 16:04:28.133623 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:28 crc kubenswrapper[4848]: I1206 16:04:28.348016 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:29 crc kubenswrapper[4848]: I1206 16:04:29.294040 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.318279 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzchp" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="registry-server" containerID="cri-o://e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f" gracePeriod=2 Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.795105 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.923819 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content\") pod \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.923970 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr9c8\" (UniqueName: \"kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8\") pod \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.924065 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities\") pod 
\"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\" (UID: \"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0\") " Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.924866 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities" (OuterVolumeSpecName: "utilities") pod "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" (UID: "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.929063 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8" (OuterVolumeSpecName: "kube-api-access-pr9c8") pod "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" (UID: "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0"). InnerVolumeSpecName "kube-api-access-pr9c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:04:30 crc kubenswrapper[4848]: I1206 16:04:30.942611 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" (UID: "42ae1027-67d0-4ae2-be4c-fbfcb5f243e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.025926 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.025961 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.025975 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr9c8\" (UniqueName: \"kubernetes.io/projected/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0-kube-api-access-pr9c8\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.328876 4848 generic.go:334] "Generic (PLEG): container finished" podID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerID="e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f" exitCode=0 Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.328930 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerDied","Data":"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f"} Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.328956 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzchp" event={"ID":"42ae1027-67d0-4ae2-be4c-fbfcb5f243e0","Type":"ContainerDied","Data":"c3c124b2b6d20a9cd559fd8e7a94dfa3b0f23d01c80295f8f103d8ba4d6a1939"} Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.328971 4848 scope.go:117] "RemoveContainer" containerID="e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 
16:04:31.329014 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzchp" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.350162 4848 scope.go:117] "RemoveContainer" containerID="f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.358747 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.367040 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzchp"] Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.367086 4848 scope.go:117] "RemoveContainer" containerID="b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.407822 4848 scope.go:117] "RemoveContainer" containerID="e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f" Dec 06 16:04:31 crc kubenswrapper[4848]: E1206 16:04:31.408347 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f\": container with ID starting with e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f not found: ID does not exist" containerID="e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.408394 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f"} err="failed to get container status \"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f\": rpc error: code = NotFound desc = could not find container \"e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f\": container with ID starting with 
e17829949c2c54d4daf69b7e2b39fc547c1067c9b8325e47474783f4b0e3479f not found: ID does not exist" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.408420 4848 scope.go:117] "RemoveContainer" containerID="f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253" Dec 06 16:04:31 crc kubenswrapper[4848]: E1206 16:04:31.408719 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253\": container with ID starting with f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253 not found: ID does not exist" containerID="f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.408762 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253"} err="failed to get container status \"f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253\": rpc error: code = NotFound desc = could not find container \"f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253\": container with ID starting with f0532b7d8caa63562aa887966d491e6317108bfdf474203d9e3016de6bd44253 not found: ID does not exist" Dec 06 16:04:31 crc kubenswrapper[4848]: I1206 16:04:31.408794 4848 scope.go:117] "RemoveContainer" containerID="b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574" Dec 06 16:04:31 crc kubenswrapper[4848]: E1206 16:04:31.409095 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574\": container with ID starting with b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574 not found: ID does not exist" containerID="b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574" Dec 06 16:04:31 crc 
kubenswrapper[4848]: I1206 16:04:31.409128 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574"} err="failed to get container status \"b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574\": rpc error: code = NotFound desc = could not find container \"b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574\": container with ID starting with b1b22c1b55db098ab2b812ef085e353669cf2799ca43d71d187ae17bce78c574 not found: ID does not exist" Dec 06 16:04:32 crc kubenswrapper[4848]: I1206 16:04:32.864767 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qlx96/must-gather-7jnlk"] Dec 06 16:04:32 crc kubenswrapper[4848]: I1206 16:04:32.865398 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qlx96/must-gather-7jnlk" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="copy" containerID="cri-o://2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a" gracePeriod=2 Dec 06 16:04:32 crc kubenswrapper[4848]: I1206 16:04:32.879606 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qlx96/must-gather-7jnlk"] Dec 06 16:04:32 crc kubenswrapper[4848]: I1206 16:04:32.980370 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" path="/var/lib/kubelet/pods/42ae1027-67d0-4ae2-be4c-fbfcb5f243e0/volumes" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.300122 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qlx96_must-gather-7jnlk_71bd67b7-42c5-4384-8431-07f05f3ae0a1/copy/0.log" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.300528 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.362588 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qlx96_must-gather-7jnlk_71bd67b7-42c5-4384-8431-07f05f3ae0a1/copy/0.log" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.363102 4848 generic.go:334] "Generic (PLEG): container finished" podID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerID="2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a" exitCode=143 Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.363255 4848 scope.go:117] "RemoveContainer" containerID="2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.363228 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qlx96/must-gather-7jnlk" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.385499 4848 scope.go:117] "RemoveContainer" containerID="ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.394543 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output\") pod \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.394620 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dq6j\" (UniqueName: \"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j\") pod \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\" (UID: \"71bd67b7-42c5-4384-8431-07f05f3ae0a1\") " Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.400437 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j" (OuterVolumeSpecName: "kube-api-access-7dq6j") pod "71bd67b7-42c5-4384-8431-07f05f3ae0a1" (UID: "71bd67b7-42c5-4384-8431-07f05f3ae0a1"). InnerVolumeSpecName "kube-api-access-7dq6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.470732 4848 scope.go:117] "RemoveContainer" containerID="2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a" Dec 06 16:04:33 crc kubenswrapper[4848]: E1206 16:04:33.471795 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a\": container with ID starting with 2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a not found: ID does not exist" containerID="2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.472715 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a"} err="failed to get container status \"2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a\": rpc error: code = NotFound desc = could not find container \"2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a\": container with ID starting with 2883d7d6adbad29b35ce15b408d075b579d4b64e737b7f7afe8a17c7d0016f2a not found: ID does not exist" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.472821 4848 scope.go:117] "RemoveContainer" containerID="ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60" Dec 06 16:04:33 crc kubenswrapper[4848]: E1206 16:04:33.473299 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60\": 
container with ID starting with ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60 not found: ID does not exist" containerID="ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.473335 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60"} err="failed to get container status \"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60\": rpc error: code = NotFound desc = could not find container \"ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60\": container with ID starting with ff21a243b8b62792b57c06b41ec7fd9d57596443bce1e29965425a594dabfa60 not found: ID does not exist" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.498929 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dq6j\" (UniqueName: \"kubernetes.io/projected/71bd67b7-42c5-4384-8431-07f05f3ae0a1-kube-api-access-7dq6j\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.550652 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "71bd67b7-42c5-4384-8431-07f05f3ae0a1" (UID: "71bd67b7-42c5-4384-8431-07f05f3ae0a1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:04:33 crc kubenswrapper[4848]: I1206 16:04:33.601202 4848 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71bd67b7-42c5-4384-8431-07f05f3ae0a1-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 16:04:34 crc kubenswrapper[4848]: I1206 16:04:34.977440 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" path="/var/lib/kubelet/pods/71bd67b7-42c5-4384-8431-07f05f3ae0a1/volumes" Dec 06 16:04:47 crc kubenswrapper[4848]: I1206 16:04:47.150297 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:04:47 crc kubenswrapper[4848]: I1206 16:04:47.151077 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.150514 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.151074 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.151118 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.151664 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.151731 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" gracePeriod=600 Dec 06 16:05:17 crc kubenswrapper[4848]: E1206 16:05:17.463831 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.725042 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" exitCode=0 Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.725090 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5"} Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.725133 4848 scope.go:117] "RemoveContainer" containerID="2e074eab4c8a7163650caab1136ef46934cfa12f99d31416b70294a37f45c54d" Dec 06 16:05:17 crc kubenswrapper[4848]: I1206 16:05:17.725797 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:05:17 crc kubenswrapper[4848]: E1206 16:05:17.726051 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:05:30 crc kubenswrapper[4848]: I1206 16:05:30.967180 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:05:30 crc kubenswrapper[4848]: E1206 16:05:30.968948 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:05:42 crc kubenswrapper[4848]: I1206 16:05:42.972226 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:05:42 crc kubenswrapper[4848]: E1206 16:05:42.972996 4848 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:05:57 crc kubenswrapper[4848]: I1206 16:05:57.966625 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:05:57 crc kubenswrapper[4848]: E1206 16:05:57.967559 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:06:10 crc kubenswrapper[4848]: I1206 16:06:10.966489 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:06:10 crc kubenswrapper[4848]: E1206 16:06:10.967424 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:06:21 crc kubenswrapper[4848]: I1206 16:06:21.966802 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:06:21 crc kubenswrapper[4848]: E1206 
16:06:21.967421 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:06:23 crc kubenswrapper[4848]: I1206 16:06:23.636878 4848 scope.go:117] "RemoveContainer" containerID="c8f9d6826b794120f097112e43636b444dc32c3c59e4b55bb7188d52d22fc057" Dec 06 16:06:36 crc kubenswrapper[4848]: I1206 16:06:36.967361 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:06:36 crc kubenswrapper[4848]: E1206 16:06:36.968330 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:06:49 crc kubenswrapper[4848]: I1206 16:06:49.966058 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:06:49 crc kubenswrapper[4848]: E1206 16:06:49.966897 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:07:02 crc 
kubenswrapper[4848]: I1206 16:07:02.972209 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:07:02 crc kubenswrapper[4848]: E1206 16:07:02.972951 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:07:16 crc kubenswrapper[4848]: I1206 16:07:16.966721 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:07:16 crc kubenswrapper[4848]: E1206 16:07:16.967586 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.886921 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqgmr/must-gather-cvwr5"] Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887878 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="gather" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887896 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="gather" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887908 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="copy" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887914 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="copy" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887933 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="extract-content" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887940 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="extract-content" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887949 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="extract-utilities" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887954 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="extract-utilities" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887969 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887975 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.887987 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="extract-content" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.887993 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="extract-content" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.888003 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888008 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: E1206 16:07:23.888017 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="extract-utilities" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888024 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="extract-utilities" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888206 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="copy" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888220 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bd67b7-42c5-4384-8431-07f05f3ae0a1" containerName="gather" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888227 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aceac04-0c82-4673-ad9d-5dd0cc04b99e" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.888238 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ae1027-67d0-4ae2-be4c-fbfcb5f243e0" containerName="registry-server" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.889579 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.892512 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qqgmr"/"default-dockercfg-c85v8" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.893107 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qqgmr"/"kube-root-ca.crt" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.894067 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qqgmr"/"openshift-service-ca.crt" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.959886 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqgmr/must-gather-cvwr5"] Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.960186 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:23 crc kubenswrapper[4848]: I1206 16:07:23.960363 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pj9t\" (UniqueName: \"kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.062743 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pj9t\" (UniqueName: \"kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " 
pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.063236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.064142 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.093597 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pj9t\" (UniqueName: \"kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t\") pod \"must-gather-cvwr5\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.208364 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.659931 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqgmr/must-gather-cvwr5"] Dec 06 16:07:24 crc kubenswrapper[4848]: W1206 16:07:24.668112 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod039b5292_5897_4e75_bf58_a0d79f12445c.slice/crio-8e85ae4f3a3357034d25a8cf7a42b68d298232b607d353b994e1b98c9750ec97 WatchSource:0}: Error finding container 8e85ae4f3a3357034d25a8cf7a42b68d298232b607d353b994e1b98c9750ec97: Status 404 returned error can't find the container with id 8e85ae4f3a3357034d25a8cf7a42b68d298232b607d353b994e1b98c9750ec97 Dec 06 16:07:24 crc kubenswrapper[4848]: I1206 16:07:24.844998 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" event={"ID":"039b5292-5897-4e75-bf58-a0d79f12445c","Type":"ContainerStarted","Data":"8e85ae4f3a3357034d25a8cf7a42b68d298232b607d353b994e1b98c9750ec97"} Dec 06 16:07:25 crc kubenswrapper[4848]: I1206 16:07:25.854036 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" event={"ID":"039b5292-5897-4e75-bf58-a0d79f12445c","Type":"ContainerStarted","Data":"7b80f0c264bc6f2f03d9cfafe6f2074d66335384acaa4ef41d7914a5d37e1666"} Dec 06 16:07:25 crc kubenswrapper[4848]: I1206 16:07:25.854295 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" event={"ID":"039b5292-5897-4e75-bf58-a0d79f12445c","Type":"ContainerStarted","Data":"a21ad75bdec154ce0373555c6532ebc44c3885ad93db4a7bb6e9dde045cada01"} Dec 06 16:07:25 crc kubenswrapper[4848]: I1206 16:07:25.872848 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" podStartSLOduration=2.872827323 
podStartE2EDuration="2.872827323s" podCreationTimestamp="2025-12-06 16:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 16:07:25.867101398 +0000 UTC m=+2313.165112311" watchObservedRunningTime="2025-12-06 16:07:25.872827323 +0000 UTC m=+2313.170838236" Dec 06 16:07:27 crc kubenswrapper[4848]: E1206 16:07:27.419426 4848 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.64:45232->38.102.83.64:35723: write tcp 38.102.83.64:45232->38.102.83.64:35723: write: broken pipe Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.662995 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-b7dm6"] Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.664463 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.851547 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m58v\" (UniqueName: \"kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v\") pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.851689 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host\") pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.954254 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host\") 
pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.954360 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m58v\" (UniqueName: \"kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v\") pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.954423 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host\") pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.967618 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:07:28 crc kubenswrapper[4848]: E1206 16:07:28.967923 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.974465 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m58v\" (UniqueName: \"kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v\") pod \"crc-debug-b7dm6\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:28 crc kubenswrapper[4848]: I1206 16:07:28.986217 4848 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:07:29 crc kubenswrapper[4848]: W1206 16:07:29.016076 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75988e1_3f6d_4943_924e_10991b46de69.slice/crio-68e46b7f6a49994daf1eff9048835403ddef14865992d225ffcb096e55a60a19 WatchSource:0}: Error finding container 68e46b7f6a49994daf1eff9048835403ddef14865992d225ffcb096e55a60a19: Status 404 returned error can't find the container with id 68e46b7f6a49994daf1eff9048835403ddef14865992d225ffcb096e55a60a19 Dec 06 16:07:29 crc kubenswrapper[4848]: I1206 16:07:29.890896 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" event={"ID":"a75988e1-3f6d-4943-924e-10991b46de69","Type":"ContainerStarted","Data":"c8527d89de7a356e55e6a1c58af918dad91f8f75c0f4f6b939d360948dda374c"} Dec 06 16:07:29 crc kubenswrapper[4848]: I1206 16:07:29.891455 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" event={"ID":"a75988e1-3f6d-4943-924e-10991b46de69","Type":"ContainerStarted","Data":"68e46b7f6a49994daf1eff9048835403ddef14865992d225ffcb096e55a60a19"} Dec 06 16:07:29 crc kubenswrapper[4848]: I1206 16:07:29.913161 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" podStartSLOduration=1.9131361949999999 podStartE2EDuration="1.913136195s" podCreationTimestamp="2025-12-06 16:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 16:07:29.903398451 +0000 UTC m=+2317.201409374" watchObservedRunningTime="2025-12-06 16:07:29.913136195 +0000 UTC m=+2317.211147108" Dec 06 16:07:39 crc kubenswrapper[4848]: I1206 16:07:39.967238 4848 scope.go:117] "RemoveContainer" 
containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:07:39 crc kubenswrapper[4848]: E1206 16:07:39.968066 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:07:52 crc kubenswrapper[4848]: I1206 16:07:52.995427 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:07:52 crc kubenswrapper[4848]: E1206 16:07:52.998635 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:08:01 crc kubenswrapper[4848]: I1206 16:08:01.135910 4848 generic.go:334] "Generic (PLEG): container finished" podID="a75988e1-3f6d-4943-924e-10991b46de69" containerID="c8527d89de7a356e55e6a1c58af918dad91f8f75c0f4f6b939d360948dda374c" exitCode=0 Dec 06 16:08:01 crc kubenswrapper[4848]: I1206 16:08:01.136016 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" event={"ID":"a75988e1-3f6d-4943-924e-10991b46de69","Type":"ContainerDied","Data":"c8527d89de7a356e55e6a1c58af918dad91f8f75c0f4f6b939d360948dda374c"} Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.272304 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.305393 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-b7dm6"] Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.313138 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-b7dm6"] Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.403732 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m58v\" (UniqueName: \"kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v\") pod \"a75988e1-3f6d-4943-924e-10991b46de69\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.403893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host\") pod \"a75988e1-3f6d-4943-924e-10991b46de69\" (UID: \"a75988e1-3f6d-4943-924e-10991b46de69\") " Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.403932 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host" (OuterVolumeSpecName: "host") pod "a75988e1-3f6d-4943-924e-10991b46de69" (UID: "a75988e1-3f6d-4943-924e-10991b46de69"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.404235 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75988e1-3f6d-4943-924e-10991b46de69-host\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.409406 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v" (OuterVolumeSpecName: "kube-api-access-4m58v") pod "a75988e1-3f6d-4943-924e-10991b46de69" (UID: "a75988e1-3f6d-4943-924e-10991b46de69"). InnerVolumeSpecName "kube-api-access-4m58v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.505661 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m58v\" (UniqueName: \"kubernetes.io/projected/a75988e1-3f6d-4943-924e-10991b46de69-kube-api-access-4m58v\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:02 crc kubenswrapper[4848]: I1206 16:08:02.975862 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75988e1-3f6d-4943-924e-10991b46de69" path="/var/lib/kubelet/pods/a75988e1-3f6d-4943-924e-10991b46de69/volumes" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.167192 4848 scope.go:117] "RemoveContainer" containerID="c8527d89de7a356e55e6a1c58af918dad91f8f75c0f4f6b939d360948dda374c" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.167352 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-b7dm6" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.458309 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-2q724"] Dec 06 16:08:03 crc kubenswrapper[4848]: E1206 16:08:03.459025 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75988e1-3f6d-4943-924e-10991b46de69" containerName="container-00" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.459040 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75988e1-3f6d-4943-924e-10991b46de69" containerName="container-00" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.459317 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75988e1-3f6d-4943-924e-10991b46de69" containerName="container-00" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.460096 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.626879 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwb2b\" (UniqueName: \"kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.626965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.728444 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwb2b\" (UniqueName: 
\"kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.728483 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.728643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.745923 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwb2b\" (UniqueName: \"kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b\") pod \"crc-debug-2q724\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: I1206 16:08:03.776128 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:03 crc kubenswrapper[4848]: W1206 16:08:03.804438 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16818204_8709_4ae8_9671_ab84180dfe4f.slice/crio-71e57174d2ea097ed9170b8b0be6ff3d88add792e73568ccadd5395ee2f3ff51 WatchSource:0}: Error finding container 71e57174d2ea097ed9170b8b0be6ff3d88add792e73568ccadd5395ee2f3ff51: Status 404 returned error can't find the container with id 71e57174d2ea097ed9170b8b0be6ff3d88add792e73568ccadd5395ee2f3ff51 Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.178491 4848 generic.go:334] "Generic (PLEG): container finished" podID="16818204-8709-4ae8-9671-ab84180dfe4f" containerID="4fa22206e69ec2db5da74fe8503097dd07d7c990a6198cadd5b32d5e56ae338a" exitCode=0 Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.178940 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-2q724" event={"ID":"16818204-8709-4ae8-9671-ab84180dfe4f","Type":"ContainerDied","Data":"4fa22206e69ec2db5da74fe8503097dd07d7c990a6198cadd5b32d5e56ae338a"} Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.178966 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-2q724" event={"ID":"16818204-8709-4ae8-9671-ab84180dfe4f","Type":"ContainerStarted","Data":"71e57174d2ea097ed9170b8b0be6ff3d88add792e73568ccadd5395ee2f3ff51"} Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.511404 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-2q724"] Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.523663 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-2q724"] Dec 06 16:08:04 crc kubenswrapper[4848]: I1206 16:08:04.968146 4848 scope.go:117] "RemoveContainer" 
containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:08:04 crc kubenswrapper[4848]: E1206 16:08:04.968456 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.293403 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.459258 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host\") pod \"16818204-8709-4ae8-9671-ab84180dfe4f\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.459372 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host" (OuterVolumeSpecName: "host") pod "16818204-8709-4ae8-9671-ab84180dfe4f" (UID: "16818204-8709-4ae8-9671-ab84180dfe4f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.459510 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwb2b\" (UniqueName: \"kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b\") pod \"16818204-8709-4ae8-9671-ab84180dfe4f\" (UID: \"16818204-8709-4ae8-9671-ab84180dfe4f\") " Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.459992 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16818204-8709-4ae8-9671-ab84180dfe4f-host\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.469894 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b" (OuterVolumeSpecName: "kube-api-access-dwb2b") pod "16818204-8709-4ae8-9671-ab84180dfe4f" (UID: "16818204-8709-4ae8-9671-ab84180dfe4f"). InnerVolumeSpecName "kube-api-access-dwb2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.562399 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwb2b\" (UniqueName: \"kubernetes.io/projected/16818204-8709-4ae8-9671-ab84180dfe4f-kube-api-access-dwb2b\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.756164 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-4mkz7"] Dec 06 16:08:05 crc kubenswrapper[4848]: E1206 16:08:05.756626 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16818204-8709-4ae8-9671-ab84180dfe4f" containerName="container-00" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.756648 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="16818204-8709-4ae8-9671-ab84180dfe4f" containerName="container-00" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.756900 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="16818204-8709-4ae8-9671-ab84180dfe4f" containerName="container-00" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.757690 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.781600 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.781767 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5kc\" (UniqueName: \"kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.883270 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.883368 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5kc\" (UniqueName: \"kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc kubenswrapper[4848]: I1206 16:08:05.883814 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:05 crc 
kubenswrapper[4848]: I1206 16:08:05.908837 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5kc\" (UniqueName: \"kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc\") pod \"crc-debug-4mkz7\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:06 crc kubenswrapper[4848]: I1206 16:08:06.098747 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:06 crc kubenswrapper[4848]: I1206 16:08:06.218586 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" event={"ID":"444be314-98da-44b1-9ae2-f624eaea6087","Type":"ContainerStarted","Data":"e1fc5b26a981b3aff52190cedd75dfa0379d2868f0d8c6fc8d89a36a4f172f69"} Dec 06 16:08:06 crc kubenswrapper[4848]: I1206 16:08:06.220585 4848 scope.go:117] "RemoveContainer" containerID="4fa22206e69ec2db5da74fe8503097dd07d7c990a6198cadd5b32d5e56ae338a" Dec 06 16:08:06 crc kubenswrapper[4848]: I1206 16:08:06.220753 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-2q724" Dec 06 16:08:07 crc kubenswrapper[4848]: I1206 16:08:07.003170 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16818204-8709-4ae8-9671-ab84180dfe4f" path="/var/lib/kubelet/pods/16818204-8709-4ae8-9671-ab84180dfe4f/volumes" Dec 06 16:08:07 crc kubenswrapper[4848]: I1206 16:08:07.228940 4848 generic.go:334] "Generic (PLEG): container finished" podID="444be314-98da-44b1-9ae2-f624eaea6087" containerID="c01ed560a36b8dec23d37c12778d0586c14ebd711f1f8b76fb8a415ae17c7939" exitCode=0 Dec 06 16:08:07 crc kubenswrapper[4848]: I1206 16:08:07.229007 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" event={"ID":"444be314-98da-44b1-9ae2-f624eaea6087","Type":"ContainerDied","Data":"c01ed560a36b8dec23d37c12778d0586c14ebd711f1f8b76fb8a415ae17c7939"} Dec 06 16:08:07 crc kubenswrapper[4848]: I1206 16:08:07.265293 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-4mkz7"] Dec 06 16:08:07 crc kubenswrapper[4848]: I1206 16:08:07.272882 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qqgmr/crc-debug-4mkz7"] Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.353886 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.429996 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host\") pod \"444be314-98da-44b1-9ae2-f624eaea6087\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.430089 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n5kc\" (UniqueName: \"kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc\") pod \"444be314-98da-44b1-9ae2-f624eaea6087\" (UID: \"444be314-98da-44b1-9ae2-f624eaea6087\") " Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.430092 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host" (OuterVolumeSpecName: "host") pod "444be314-98da-44b1-9ae2-f624eaea6087" (UID: "444be314-98da-44b1-9ae2-f624eaea6087"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.430691 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444be314-98da-44b1-9ae2-f624eaea6087-host\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.435441 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc" (OuterVolumeSpecName: "kube-api-access-6n5kc") pod "444be314-98da-44b1-9ae2-f624eaea6087" (UID: "444be314-98da-44b1-9ae2-f624eaea6087"). InnerVolumeSpecName "kube-api-access-6n5kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.532445 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n5kc\" (UniqueName: \"kubernetes.io/projected/444be314-98da-44b1-9ae2-f624eaea6087-kube-api-access-6n5kc\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:08 crc kubenswrapper[4848]: I1206 16:08:08.981231 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444be314-98da-44b1-9ae2-f624eaea6087" path="/var/lib/kubelet/pods/444be314-98da-44b1-9ae2-f624eaea6087/volumes" Dec 06 16:08:09 crc kubenswrapper[4848]: I1206 16:08:09.247300 4848 scope.go:117] "RemoveContainer" containerID="c01ed560a36b8dec23d37c12778d0586c14ebd711f1f8b76fb8a415ae17c7939" Dec 06 16:08:09 crc kubenswrapper[4848]: I1206 16:08:09.247352 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/crc-debug-4mkz7" Dec 06 16:08:16 crc kubenswrapper[4848]: I1206 16:08:16.967164 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:08:16 crc kubenswrapper[4848]: E1206 16:08:16.967762 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.761293 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:19 crc kubenswrapper[4848]: E1206 16:08:19.762232 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444be314-98da-44b1-9ae2-f624eaea6087" 
containerName="container-00" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.762248 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="444be314-98da-44b1-9ae2-f624eaea6087" containerName="container-00" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.762449 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="444be314-98da-44b1-9ae2-f624eaea6087" containerName="container-00" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.763928 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.773379 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.863987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.864092 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wld72\" (UniqueName: \"kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.864212 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " 
pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.965954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.966030 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wld72\" (UniqueName: \"kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.966152 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.966498 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc kubenswrapper[4848]: I1206 16:08:19.966613 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:19 crc 
kubenswrapper[4848]: I1206 16:08:19.986636 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wld72\" (UniqueName: \"kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72\") pod \"redhat-operators-c7m77\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:20 crc kubenswrapper[4848]: I1206 16:08:20.083576 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:20 crc kubenswrapper[4848]: I1206 16:08:20.607573 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:21 crc kubenswrapper[4848]: I1206 16:08:21.515956 4848 generic.go:334] "Generic (PLEG): container finished" podID="118d783a-2dad-4aef-83fc-db9d1734154a" containerID="80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa" exitCode=0 Dec 06 16:08:21 crc kubenswrapper[4848]: I1206 16:08:21.516057 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerDied","Data":"80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa"} Dec 06 16:08:21 crc kubenswrapper[4848]: I1206 16:08:21.516486 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerStarted","Data":"2b6964efdfc04f73e299cf2e68ab2a5adf87068490e71dd4ec5f5c3b04ed0451"} Dec 06 16:08:22 crc kubenswrapper[4848]: I1206 16:08:22.525244 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerStarted","Data":"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563"} Dec 06 16:08:23 crc kubenswrapper[4848]: I1206 
16:08:23.535788 4848 generic.go:334] "Generic (PLEG): container finished" podID="118d783a-2dad-4aef-83fc-db9d1734154a" containerID="b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563" exitCode=0 Dec 06 16:08:23 crc kubenswrapper[4848]: I1206 16:08:23.536033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerDied","Data":"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563"} Dec 06 16:08:24 crc kubenswrapper[4848]: I1206 16:08:24.546256 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerStarted","Data":"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600"} Dec 06 16:08:24 crc kubenswrapper[4848]: I1206 16:08:24.569356 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c7m77" podStartSLOduration=3.179724887 podStartE2EDuration="5.56933537s" podCreationTimestamp="2025-12-06 16:08:19 +0000 UTC" firstStartedPulling="2025-12-06 16:08:21.517484087 +0000 UTC m=+2368.815495000" lastFinishedPulling="2025-12-06 16:08:23.90709457 +0000 UTC m=+2371.205105483" observedRunningTime="2025-12-06 16:08:24.567007316 +0000 UTC m=+2371.865018229" watchObservedRunningTime="2025-12-06 16:08:24.56933537 +0000 UTC m=+2371.867346293" Dec 06 16:08:27 crc kubenswrapper[4848]: I1206 16:08:27.966309 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:08:27 crc kubenswrapper[4848]: E1206 16:08:27.966830 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.143274 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5668cb4c58-xxwrf_3ae8c267-b7dd-4336-bedb-11c1e1bae7c3/barbican-api/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.345052 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5668cb4c58-xxwrf_3ae8c267-b7dd-4336-bedb-11c1e1bae7c3/barbican-api-log/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.469689 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84d7d7d8f8-jgnhx_318d0309-cf5f-4bfe-8c93-c72f13ce4a24/barbican-keystone-listener/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.552599 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84d7d7d8f8-jgnhx_318d0309-cf5f-4bfe-8c93-c72f13ce4a24/barbican-keystone-listener-log/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.665808 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69c9c94d7-cmv75_e66ac417-22af-4413-afdd-d3b8006a5eb8/barbican-worker-log/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.687631 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69c9c94d7-cmv75_e66ac417-22af-4413-afdd-d3b8006a5eb8/barbican-worker/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.820448 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/ceilometer-central-agent/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.912044 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/ceilometer-notification-agent/0.log" Dec 06 16:08:29 crc kubenswrapper[4848]: I1206 16:08:29.930608 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/proxy-httpd/0.log" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.012937 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6a0084de-2a42-4cd7-a5ce-67c1770870b2/sg-core/0.log" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.084280 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.084357 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.139944 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.410155 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d5c9c312-22cc-49cf-b342-247cfd7b1906/probe/0.log" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.656257 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.717355 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:30 crc kubenswrapper[4848]: I1206 16:08:30.937999 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/init/0.log" Dec 06 16:08:31 crc kubenswrapper[4848]: I1206 16:08:31.846942 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_61e60c86-1fae-4b73-9c2c-bb5bdd108630/cinder-api-log/0.log" Dec 06 16:08:31 crc kubenswrapper[4848]: I1206 16:08:31.884629 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_61e60c86-1fae-4b73-9c2c-bb5bdd108630/cinder-api/0.log" Dec 06 16:08:31 crc kubenswrapper[4848]: I1206 16:08:31.907863 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d5c9c312-22cc-49cf-b342-247cfd7b1906/cinder-scheduler/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.038408 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/init/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.089237 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c7b6c5df9-qpndw_0fbe788b-df4c-456d-a2d3-b64abbf62ac7/dnsmasq-dns/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.097438 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9f14ea76-e339-4e47-9063-898de1d2fac8/glance-httpd/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.242561 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9f14ea76-e339-4e47-9063-898de1d2fac8/glance-log/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.301483 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_08ab771a-21e5-4145-8954-8ac8c039a8c4/glance-httpd/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.371045 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_08ab771a-21e5-4145-8954-8ac8c039a8c4/glance-log/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.491338 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/init/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.620400 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c7m77" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="registry-server" containerID="cri-o://da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600" gracePeriod=2 Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.648627 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/init/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.712753 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/ironic-api/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.739689 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.743556 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-59575bb9d8-57gb5_abe62341-68ac-438b-8aa5-4b0067c8c9ea/ironic-api-log/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.955996 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:08:32 crc kubenswrapper[4848]: I1206 16:08:32.975470 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.050001 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:08:33 crc 
kubenswrapper[4848]: I1206 16:08:33.204177 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.254593 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.475793 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-python-agent-init/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.482628 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/init/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.629536 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.635583 4848 generic.go:334] "Generic (PLEG): container finished" podID="118d783a-2dad-4aef-83fc-db9d1734154a" containerID="da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600" exitCode=0 Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.635626 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerDied","Data":"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600"} Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.635653 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7m77" event={"ID":"118d783a-2dad-4aef-83fc-db9d1734154a","Type":"ContainerDied","Data":"2b6964efdfc04f73e299cf2e68ab2a5adf87068490e71dd4ec5f5c3b04ed0451"} Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.635672 4848 
scope.go:117] "RemoveContainer" containerID="da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.635829 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7m77" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.669646 4848 scope.go:117] "RemoveContainer" containerID="b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.684350 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/httpboot/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.699317 4848 scope.go:117] "RemoveContainer" containerID="80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.759888 4848 scope.go:117] "RemoveContainer" containerID="da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600" Dec 06 16:08:33 crc kubenswrapper[4848]: E1206 16:08:33.760536 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600\": container with ID starting with da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600 not found: ID does not exist" containerID="da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.760572 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600"} err="failed to get container status \"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600\": rpc error: code = NotFound desc = could not find container \"da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600\": container with 
ID starting with da74a73765041caa6a4d45f417749a76f67d8bb869f9a50455a0284bdf176600 not found: ID does not exist" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.760593 4848 scope.go:117] "RemoveContainer" containerID="b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563" Dec 06 16:08:33 crc kubenswrapper[4848]: E1206 16:08:33.760899 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563\": container with ID starting with b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563 not found: ID does not exist" containerID="b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.760925 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563"} err="failed to get container status \"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563\": rpc error: code = NotFound desc = could not find container \"b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563\": container with ID starting with b252561c9138ebcd141f568fdf62b0c96f5b2b744b5e8eff4d8e7f1580971563 not found: ID does not exist" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.760937 4848 scope.go:117] "RemoveContainer" containerID="80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa" Dec 06 16:08:33 crc kubenswrapper[4848]: E1206 16:08:33.761133 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa\": container with ID starting with 80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa not found: ID does not exist" containerID="80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa" Dec 06 
16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.761154 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa"} err="failed to get container status \"80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa\": rpc error: code = NotFound desc = could not find container \"80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa\": container with ID starting with 80a12d58587b1e839198338364821b836f10e579b342a99309f0757b2bc294fa not found: ID does not exist" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.779866 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wld72\" (UniqueName: \"kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72\") pod \"118d783a-2dad-4aef-83fc-db9d1734154a\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.779994 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities\") pod \"118d783a-2dad-4aef-83fc-db9d1734154a\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.780099 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content\") pod \"118d783a-2dad-4aef-83fc-db9d1734154a\" (UID: \"118d783a-2dad-4aef-83fc-db9d1734154a\") " Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.781551 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities" (OuterVolumeSpecName: "utilities") pod "118d783a-2dad-4aef-83fc-db9d1734154a" (UID: "118d783a-2dad-4aef-83fc-db9d1734154a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.786278 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72" (OuterVolumeSpecName: "kube-api-access-wld72") pod "118d783a-2dad-4aef-83fc-db9d1734154a" (UID: "118d783a-2dad-4aef-83fc-db9d1734154a"). InnerVolumeSpecName "kube-api-access-wld72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.881957 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wld72\" (UniqueName: \"kubernetes.io/projected/118d783a-2dad-4aef-83fc-db9d1734154a-kube-api-access-wld72\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.881989 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.914165 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "118d783a-2dad-4aef-83fc-db9d1734154a" (UID: "118d783a-2dad-4aef-83fc-db9d1734154a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.942049 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ironic-conductor/0.log" Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.973496 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.982125 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c7m77"] Dec 06 16:08:33 crc kubenswrapper[4848]: I1206 16:08:33.984200 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d783a-2dad-4aef-83fc-db9d1734154a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.127125 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/ramdisk-logs/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.460364 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.506390 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/init/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.740172 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/init/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.794228 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-7xkpk_a75f41ed-628b-4e88-8d67-ada299f1c7a9/ironic-db-sync/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.894442 
4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.960433 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.975237 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" path="/var/lib/kubelet/pods/118d783a-2dad-4aef-83fc-db9d1734154a/volumes" Dec 06 16:08:34 crc kubenswrapper[4848]: I1206 16:08:34.983340 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.026512 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_ab198686-7839-4e39-abdb-ea9b65893a02/pxe-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.189896 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.202770 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.388850 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.535105 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-python-agent-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.546653 4848 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-httpboot/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.594160 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/inspector-pxe-init/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.619006 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-inspector-httpd/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.659621 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ironic-inspector/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.729113 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3e8a616c-5de8-4037-86f3-1d4e891947f6/ramdisk-logs/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.877856 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-sync-6xvjg_fb815e0a-f0ff-40d4-b1c4-1220b71db056/ironic-inspector-db-sync/0.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.879552 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-5f6db98496-rh44f_692f44d3-ff17-419f-b16c-b37f71521603/ironic-neutron-agent/2.log" Dec 06 16:08:35 crc kubenswrapper[4848]: I1206 16:08:35.935218 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-5f6db98496-rh44f_692f44d3-ff17-419f-b16c-b37f71521603/ironic-neutron-agent/1.log" Dec 06 16:08:36 crc kubenswrapper[4848]: I1206 16:08:36.107381 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29417281-xvp8d_dbabea74-70c4-42a2-aa24-2f3976b6b616/keystone-cron/0.log" Dec 06 16:08:36 crc 
kubenswrapper[4848]: I1206 16:08:36.251504 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e9aa43e5-ee37-49dc-8278-f2018f524c42/kube-state-metrics/0.log" Dec 06 16:08:36 crc kubenswrapper[4848]: I1206 16:08:36.272132 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84fbb4c9b8-bdccl_63d987d2-da9e-4cfb-b409-2b5c66f307f8/keystone-api/0.log" Dec 06 16:08:36 crc kubenswrapper[4848]: I1206 16:08:36.520255 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7d47d5c9-wf778_aead053e-0f4a-48bf-b446-9a1dbdc7e996/neutron-httpd/0.log" Dec 06 16:08:36 crc kubenswrapper[4848]: I1206 16:08:36.580246 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7d47d5c9-wf778_aead053e-0f4a-48bf-b446-9a1dbdc7e996/neutron-api/0.log" Dec 06 16:08:36 crc kubenswrapper[4848]: I1206 16:08:36.890414 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_310ce79c-5eaa-461b-b99c-9e4aee7849c4/nova-api-log/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.058036 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_310ce79c-5eaa-461b-b99c-9e4aee7849c4/nova-api-api/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.103877 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ea901ebd-9d73-4ba0-8448-634e2b3d17f7/nova-cell0-conductor-conductor/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.280414 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_40340eae-e441-4326-b678-265f2cd36d20/nova-cell1-conductor-conductor/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.456768 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_789dd8b9-2530-46a7-b5ee-7276afa689fb/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 16:08:37 crc 
kubenswrapper[4848]: I1206 16:08:37.600848 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebc9f0-bb2e-4deb-aa48-00553c450e81/nova-metadata-log/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.838012 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/mysql-bootstrap/0.log" Dec 06 16:08:37 crc kubenswrapper[4848]: I1206 16:08:37.859076 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4d6e9497-4228-4848-962e-e319a0c0fdf4/nova-scheduler-scheduler/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.062192 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/galera/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.097993 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebc9f0-bb2e-4deb-aa48-00553c450e81/nova-metadata-metadata/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.122490 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb6169da-9db4-4d22-bd22-aaf2322103df/mysql-bootstrap/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.280160 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/mysql-bootstrap/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.524320 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/galera/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.556091 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_28b28ed8-c6be-4256-8ccd-8c560959048b/openstackclient/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.567346 4848 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a1c66a34-c907-4841-92b1-0799522b6bd5/mysql-bootstrap/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.752895 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-g6wzf_93c0a1e4-91cd-4801-8439-a41fb872135f/ovn-controller/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.803061 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fl4pk_28348623-0697-417e-8f17-de443d77348c/openstack-network-exporter/0.log" Dec 06 16:08:38 crc kubenswrapper[4848]: I1206 16:08:38.967894 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:08:38 crc kubenswrapper[4848]: E1206 16:08:38.968106 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.135606 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server-init/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.291012 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovs-vswitchd/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.340837 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server-init/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 
16:08:39.377329 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcx5h_978d5b2d-7113-4b4e-944a-2681e5da434d/ovsdb-server/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.493881 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_04a22b86-df7e-4426-aa1c-3f8c21c02354/openstack-network-exporter/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.538849 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_04a22b86-df7e-4426-aa1c-3f8c21c02354/ovn-northd/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.669010 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2928825b-3e1c-48cd-827e-afad27fe84c1/openstack-network-exporter/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.767371 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2928825b-3e1c-48cd-827e-afad27fe84c1/ovsdbserver-nb/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.893358 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d723dc9-fd9d-4b78-9fa6-c18e8656f634/openstack-network-exporter/0.log" Dec 06 16:08:39 crc kubenswrapper[4848]: I1206 16:08:39.893914 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d723dc9-fd9d-4b78-9fa6-c18e8656f634/ovsdbserver-sb/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.057158 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7b97f6b4-d9lkl_96280da8-11f8-49be-81a6-3bdcd053463f/placement-api/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.203528 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/setup-container/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.208447 4848 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-b7b97f6b4-d9lkl_96280da8-11f8-49be-81a6-3bdcd053463f/placement-log/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.417972 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/setup-container/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.444081 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0b4e03ac-e69e-46de-9dd1-5ea4cc56c0c7/rabbitmq/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.527341 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/setup-container/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.670264 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/setup-container/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.707715 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d58ead1c-d7f6-4643-a869-8566f5d9843b/rabbitmq/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.873525 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76858ffddc-pvnks_6c86a3f4-dccd-48e8-9169-a63eaaded209/proxy-httpd/0.log" Dec 06 16:08:40 crc kubenswrapper[4848]: I1206 16:08:40.929899 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76858ffddc-pvnks_6c86a3f4-dccd-48e8-9169-a63eaaded209/proxy-server/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.090165 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-99ss6_6e8f9caf-2ee3-4f6b-9512-c92b77bbe16d/swift-ring-rebalance/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.180435 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-reaper/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.306783 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-auditor/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.318567 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-replicator/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.364855 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/account-server/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.431913 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-auditor/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.486238 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-replicator/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.545974 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-server/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.593186 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/container-updater/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.629446 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-auditor/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.689619 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-expirer/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.754399 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-replicator/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.835896 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-server/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.860135 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/object-updater/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.926594 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/rsync/0.log" Dec 06 16:08:41 crc kubenswrapper[4848]: I1206 16:08:41.930575 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_14e2fe95-2aba-441c-85e1-ebd9bc0ba12f/swift-recon-cron/0.log" Dec 06 16:08:46 crc kubenswrapper[4848]: I1206 16:08:46.443021 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d62dd990-abf6-47e3-aafd-5e7efb0ab5c6/memcached/0.log" Dec 06 16:08:52 crc kubenswrapper[4848]: I1206 16:08:52.974098 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:08:52 crc kubenswrapper[4848]: E1206 16:08:52.986128 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.052107 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.201743 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.203872 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.222061 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.358619 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/pull/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.379239 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/util/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.400935 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_aa7a935cbf7167890f3ecde71c431f2969fe5a672e67ae80e526bab0ce8n5dg_0a2e359c-0d23-4b5f-a484-9d010361a7dd/extract/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.529049 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-69cx9_706ced85-8889-45e9-bd15-1a2747a9de2e/kube-rbac-proxy/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.613504 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-plnw5_9418eec1-6430-4bb9-a7be-6ec83f61c629/kube-rbac-proxy/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.629042 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-69cx9_706ced85-8889-45e9-bd15-1a2747a9de2e/manager/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.755080 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-plnw5_9418eec1-6430-4bb9-a7be-6ec83f61c629/manager/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.825064 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-rhsqm_1438e750-61e2-4c37-8d03-f22b8ebad123/manager/0.log" Dec 06 16:09:04 crc kubenswrapper[4848]: I1206 16:09:04.871068 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-rhsqm_1438e750-61e2-4c37-8d03-f22b8ebad123/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.003537 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-dzsgv_d99ce981-7f71-4636-94c7-c848830429f3/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.049638 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-dzsgv_d99ce981-7f71-4636-94c7-c848830429f3/manager/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 
16:09:05.173394 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6vq9l_fd64b532-b259-49e5-bd47-62d9d20b6a69/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.253446 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6vq9l_fd64b532-b259-49e5-bd47-62d9d20b6a69/manager/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.347518 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jtn8k_1b49aafe-c450-441c-ade7-0d87b868dc2a/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.378762 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jtn8k_1b49aafe-c450-441c-ade7-0d87b868dc2a/manager/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.470226 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-4k55x_cf4f4d25-7fc2-411f-9e23-71171162f38a/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.679291 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54476ccddc-74npj_eecbdda5-b888-4fb9-979d-66bb4d8ffcf4/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.764457 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54476ccddc-74npj_eecbdda5-b888-4fb9-979d-66bb4d8ffcf4/manager/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.778947 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-4k55x_cf4f4d25-7fc2-411f-9e23-71171162f38a/manager/0.log" Dec 06 
16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.870414 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4gqwc_fee1c62f-115d-472d-8617-32a386cf06c2/kube-rbac-proxy/0.log" Dec 06 16:09:05 crc kubenswrapper[4848]: I1206 16:09:05.967000 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:09:05 crc kubenswrapper[4848]: E1206 16:09:05.967409 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.025102 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4gqwc_fee1c62f-115d-472d-8617-32a386cf06c2/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.062985 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-4vwqn_76bfd2a6-6774-4992-91ec-c73327b11bd8/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.095566 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-4vwqn_76bfd2a6-6774-4992-91ec-c73327b11bd8/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.230148 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-9vx7t_95e14baa-5bbb-4bf6-9420-0428c25cc98f/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 
16:09:06.267045 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-9vx7t_95e14baa-5bbb-4bf6-9420-0428c25cc98f/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.373569 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vpn6w_7c78ca28-a1c2-45b9-9a14-733aae9ee555/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.454206 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-s6hbc_fa68cc1e-18ec-42d1-a3de-948ef2cc0804/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.454715 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vpn6w_7c78ca28-a1c2-45b9-9a14-733aae9ee555/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.624138 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-s6hbc_fa68cc1e-18ec-42d1-a3de-948ef2cc0804/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.665901 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9zcn8_0d42fecd-1b6d-4f29-816c-38515c0a547c/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.705360 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9zcn8_0d42fecd-1b6d-4f29-816c-38515c0a547c/manager/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.863442 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fgtgmh_ec6dd72c-05cb-49f7-af2a-01a76807175c/kube-rbac-proxy/0.log" Dec 06 16:09:06 crc kubenswrapper[4848]: I1206 16:09:06.881076 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fgtgmh_ec6dd72c-05cb-49f7-af2a-01a76807175c/manager/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.218026 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-865d7c46f-s58q4_e10571bc-0d25-47b1-bcd1-272c89db1fb6/operator/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.320096 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-62kg6_10365f81-1470-441b-9533-24f6a526fe55/registry-server/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.432277 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-x7d2b_dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd/kube-rbac-proxy/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.623670 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-x7d2b_dfd95140-dda4-46d6-af2d-1f5cbd9f4cbd/manager/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.677049 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bxhmf_90274bce-5d78-4f62-8265-2218bc58916d/kube-rbac-proxy/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.789092 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bxhmf_90274bce-5d78-4f62-8265-2218bc58916d/manager/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 
16:09:07.870640 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pzg55_deacbe3a-30ba-42bb-a180-f8e2360ba937/operator/0.log" Dec 06 16:09:07 crc kubenswrapper[4848]: I1206 16:09:07.949910 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fcf4cdbd6-q2rkz_bcfbe560-d48b-48d7-bcf8-cbf4fd2dd826/manager/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.019360 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-5h8p9_d2b8bc9b-a359-4fe9-a039-200eed4f7218/kube-rbac-proxy/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.063152 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-5h8p9_d2b8bc9b-a359-4fe9-a039-200eed4f7218/manager/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.147690 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cwth7_cf3406a1-eb72-4251-bbe3-45a33235ac96/kube-rbac-proxy/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.243847 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-cwth7_cf3406a1-eb72-4251-bbe3-45a33235ac96/manager/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.286726 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-x7tzd_f59ce483-133a-495a-862c-3676661adab8/kube-rbac-proxy/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.340623 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-x7tzd_f59ce483-133a-495a-862c-3676661adab8/manager/0.log" Dec 06 
16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.382984 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-s6ttq_93dc2914-4eae-4bd4-a4c3-94122e44f908/kube-rbac-proxy/0.log" Dec 06 16:09:08 crc kubenswrapper[4848]: I1206 16:09:08.457326 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-s6ttq_93dc2914-4eae-4bd4-a4c3-94122e44f908/manager/0.log" Dec 06 16:09:16 crc kubenswrapper[4848]: I1206 16:09:16.967212 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:09:16 crc kubenswrapper[4848]: E1206 16:09:16.967967 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:09:24 crc kubenswrapper[4848]: I1206 16:09:24.863637 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6wjr7_204ca83b-b95a-451a-bc43-a46bf0f2859d/control-plane-machine-set-operator/0.log" Dec 06 16:09:25 crc kubenswrapper[4848]: I1206 16:09:25.053963 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7h7ps_4a6e4efa-abe1-44da-9528-73fe113e016a/kube-rbac-proxy/0.log" Dec 06 16:09:25 crc kubenswrapper[4848]: I1206 16:09:25.078911 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7h7ps_4a6e4efa-abe1-44da-9528-73fe113e016a/machine-api-operator/0.log" Dec 06 16:09:28 crc kubenswrapper[4848]: 
I1206 16:09:28.967055 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:09:28 crc kubenswrapper[4848]: E1206 16:09:28.967643 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:09:36 crc kubenswrapper[4848]: I1206 16:09:36.501539 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-q9bg5_be7d17e4-0e3a-4a56-9a2f-d8a7ab9b0960/cert-manager-controller/0.log" Dec 06 16:09:36 crc kubenswrapper[4848]: I1206 16:09:36.683784 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zlwlz_96eace79-3285-4aef-902d-aed98f97663c/cert-manager-cainjector/0.log" Dec 06 16:09:36 crc kubenswrapper[4848]: I1206 16:09:36.708797 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pwsfx_bd79a30b-a387-4c71-9415-5d8c8a20cd63/cert-manager-webhook/0.log" Dec 06 16:09:39 crc kubenswrapper[4848]: I1206 16:09:39.966327 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:09:39 crc kubenswrapper[4848]: E1206 16:09:39.967734 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" 
podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.414313 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-f5s7t_ee3d85c7-86b8-4eb5-9832-65313a05b1a4/nmstate-console-plugin/0.log" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.578455 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-j5wcn_49d9817d-6554-407c-9842-c044929eb803/nmstate-handler/0.log" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.620090 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nfpfl_b6ab59ca-8a07-452a-bc3a-b071fccdc3ff/nmstate-metrics/0.log" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.642687 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nfpfl_b6ab59ca-8a07-452a-bc3a-b071fccdc3ff/kube-rbac-proxy/0.log" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.759074 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5m7pn_3ad4635f-e66a-4dee-a97b-e2b94ae72319/nmstate-operator/0.log" Dec 06 16:09:48 crc kubenswrapper[4848]: I1206 16:09:48.845127 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8hj7l_54fb82cb-0b0a-42e8-99dc-3df8b471bd89/nmstate-webhook/0.log" Dec 06 16:09:50 crc kubenswrapper[4848]: I1206 16:09:50.966748 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:09:50 crc kubenswrapper[4848]: E1206 16:09:50.967258 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.414684 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9xn7s_53fefa8d-ef28-4a20-8e75-b633d64f4863/kube-rbac-proxy/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.553936 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9xn7s_53fefa8d-ef28-4a20-8e75-b633d64f4863/controller/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.672533 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.837151 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.838191 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.860393 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.865394 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.966794 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:10:01 crc kubenswrapper[4848]: E1206 16:10:01.967052 4848 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:10:01 crc kubenswrapper[4848]: I1206 16:10:01.981801 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.032456 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.037240 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.068302 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.210731 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-reloader/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.233974 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-frr-files/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.235654 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/controller/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.265398 4848 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/cp-metrics/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.428779 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/frr-metrics/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.446787 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/kube-rbac-proxy/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.504599 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/kube-rbac-proxy-frr/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.636813 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/reloader/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.720295 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-ptrrq_99f1be97-6216-4892-8da4-32dac60daaaa/frr-k8s-webhook-server/0.log" Dec 06 16:10:02 crc kubenswrapper[4848]: I1206 16:10:02.869944 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-d8c5d4748-tk9lq_b31b58c9-49c3-4e31-b838-5532169c319b/manager/0.log" Dec 06 16:10:03 crc kubenswrapper[4848]: I1206 16:10:03.072423 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57cbc9df7c-kltd9_d03f9739-5d90-45b0-a717-bc66f0522234/webhook-server/0.log" Dec 06 16:10:03 crc kubenswrapper[4848]: I1206 16:10:03.183177 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4b5lc_1649535c-6c66-412c-be24-f452edfe82a1/kube-rbac-proxy/0.log" Dec 06 16:10:03 crc kubenswrapper[4848]: I1206 
16:10:03.440312 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pqzrf_ebaeee0b-28be-4402-8323-57dad52490a6/frr/0.log" Dec 06 16:10:03 crc kubenswrapper[4848]: I1206 16:10:03.640846 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4b5lc_1649535c-6c66-412c-be24-f452edfe82a1/speaker/0.log" Dec 06 16:10:12 crc kubenswrapper[4848]: I1206 16:10:12.973560 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:10:12 crc kubenswrapper[4848]: E1206 16:10:12.974857 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mrg5_openshift-machine-config-operator(fc8499a5-41f5-49e8-a206-3240532ec6a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.419872 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.550204 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.585793 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.607124 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.772066 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/extract/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.772911 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/pull/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.779754 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ft69pc_79b3772c-7550-4073-a2cc-42508125cb74/util/0.log" Dec 06 16:10:15 crc kubenswrapper[4848]: I1206 16:10:15.936919 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.103923 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.105552 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.111409 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 
16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.269974 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/pull/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.274079 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/util/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.313360 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83qwbsd_c523ae20-637c-439b-9869-98cf3ac3c8a0/extract/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.438984 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.561250 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.589921 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.592850 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.763447 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-utilities/0.log" Dec 06 16:10:16 crc 
kubenswrapper[4848]: I1206 16:10:16.769518 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/extract-content/0.log" Dec 06 16:10:16 crc kubenswrapper[4848]: I1206 16:10:16.942562 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.165954 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k889q_a35496c1-709b-4b72-8f68-c72b29e955ea/registry-server/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.190022 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.211995 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.247958 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.366280 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-content/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.381287 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/extract-utilities/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.575594 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zgk8b_6c0af646-ef02-4709-9e31-cd29cd07fa4a/marketplace-operator/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.706917 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.787456 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dvh52_5245db24-208d-47cf-9d64-d62b203292e2/registry-server/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.922636 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.960513 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:10:17 crc kubenswrapper[4848]: I1206 16:10:17.984132 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.132830 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-utilities/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.184964 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/extract-content/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.211857 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7xkb_11513b70-de83-43bd-a70f-fdcc3a6aa1da/registry-server/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.328211 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.497954 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.528227 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.544310 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.674919 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-content/0.log" Dec 06 16:10:18 crc kubenswrapper[4848]: I1206 16:10:18.687782 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/extract-utilities/0.log" Dec 06 16:10:19 crc kubenswrapper[4848]: I1206 16:10:19.051014 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-47f72_95984d31-57cc-4d15-b0da-208b3bba0cfc/registry-server/0.log" Dec 06 16:10:24 crc kubenswrapper[4848]: I1206 16:10:24.966770 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:10:25 crc kubenswrapper[4848]: I1206 16:10:25.976774 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"f49608ba999dc0c2dd6204fb2bae6ecee08284972706246863789e2ccffb956b"} Dec 06 16:11:52 crc kubenswrapper[4848]: I1206 16:11:52.794303 4848 generic.go:334] "Generic (PLEG): container finished" podID="039b5292-5897-4e75-bf58-a0d79f12445c" containerID="a21ad75bdec154ce0373555c6532ebc44c3885ad93db4a7bb6e9dde045cada01" exitCode=0 Dec 06 16:11:52 crc kubenswrapper[4848]: I1206 16:11:52.794787 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" event={"ID":"039b5292-5897-4e75-bf58-a0d79f12445c","Type":"ContainerDied","Data":"a21ad75bdec154ce0373555c6532ebc44c3885ad93db4a7bb6e9dde045cada01"} Dec 06 16:11:52 crc kubenswrapper[4848]: I1206 16:11:52.795449 4848 scope.go:117] "RemoveContainer" containerID="a21ad75bdec154ce0373555c6532ebc44c3885ad93db4a7bb6e9dde045cada01" Dec 06 16:11:53 crc kubenswrapper[4848]: I1206 16:11:53.042650 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qqgmr_must-gather-cvwr5_039b5292-5897-4e75-bf58-a0d79f12445c/gather/0.log" Dec 06 16:12:03 crc kubenswrapper[4848]: I1206 16:12:03.546759 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qqgmr/must-gather-cvwr5"] Dec 06 16:12:03 crc kubenswrapper[4848]: I1206 16:12:03.547726 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="copy" containerID="cri-o://7b80f0c264bc6f2f03d9cfafe6f2074d66335384acaa4ef41d7914a5d37e1666" gracePeriod=2 Dec 06 16:12:03 crc kubenswrapper[4848]: I1206 16:12:03.555642 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qqgmr/must-gather-cvwr5"] Dec 06 16:12:03 crc kubenswrapper[4848]: I1206 
16:12:03.880958 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qqgmr_must-gather-cvwr5_039b5292-5897-4e75-bf58-a0d79f12445c/copy/0.log" Dec 06 16:12:03 crc kubenswrapper[4848]: I1206 16:12:03.881921 4848 generic.go:334] "Generic (PLEG): container finished" podID="039b5292-5897-4e75-bf58-a0d79f12445c" containerID="7b80f0c264bc6f2f03d9cfafe6f2074d66335384acaa4ef41d7914a5d37e1666" exitCode=143 Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.170590 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qqgmr_must-gather-cvwr5_039b5292-5897-4e75-bf58-a0d79f12445c/copy/0.log" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.171011 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.262059 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pj9t\" (UniqueName: \"kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t\") pod \"039b5292-5897-4e75-bf58-a0d79f12445c\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.262439 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output\") pod \"039b5292-5897-4e75-bf58-a0d79f12445c\" (UID: \"039b5292-5897-4e75-bf58-a0d79f12445c\") " Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.267586 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t" (OuterVolumeSpecName: "kube-api-access-4pj9t") pod "039b5292-5897-4e75-bf58-a0d79f12445c" (UID: "039b5292-5897-4e75-bf58-a0d79f12445c"). InnerVolumeSpecName "kube-api-access-4pj9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.364591 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pj9t\" (UniqueName: \"kubernetes.io/projected/039b5292-5897-4e75-bf58-a0d79f12445c-kube-api-access-4pj9t\") on node \"crc\" DevicePath \"\"" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.389011 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "039b5292-5897-4e75-bf58-a0d79f12445c" (UID: "039b5292-5897-4e75-bf58-a0d79f12445c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.466162 4848 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/039b5292-5897-4e75-bf58-a0d79f12445c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.896521 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qqgmr_must-gather-cvwr5_039b5292-5897-4e75-bf58-a0d79f12445c/copy/0.log" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.897062 4848 scope.go:117] "RemoveContainer" containerID="7b80f0c264bc6f2f03d9cfafe6f2074d66335384acaa4ef41d7914a5d37e1666" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.897106 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqgmr/must-gather-cvwr5" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.916153 4848 scope.go:117] "RemoveContainer" containerID="a21ad75bdec154ce0373555c6532ebc44c3885ad93db4a7bb6e9dde045cada01" Dec 06 16:12:04 crc kubenswrapper[4848]: I1206 16:12:04.980457 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" path="/var/lib/kubelet/pods/039b5292-5897-4e75-bf58-a0d79f12445c/volumes" Dec 06 16:12:47 crc kubenswrapper[4848]: I1206 16:12:47.150066 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:12:47 crc kubenswrapper[4848]: I1206 16:12:47.150853 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:13:17 crc kubenswrapper[4848]: I1206 16:13:17.150158 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:13:17 crc kubenswrapper[4848]: I1206 16:13:17.150690 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.656387 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:34 crc kubenswrapper[4848]: E1206 16:13:34.657518 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="registry-server" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657539 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="registry-server" Dec 06 16:13:34 crc kubenswrapper[4848]: E1206 16:13:34.657568 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="extract-utilities" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657578 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="extract-utilities" Dec 06 16:13:34 crc kubenswrapper[4848]: E1206 16:13:34.657593 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="copy" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657602 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="copy" Dec 06 16:13:34 crc kubenswrapper[4848]: E1206 16:13:34.657625 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="extract-content" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657635 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="extract-content" Dec 06 16:13:34 crc kubenswrapper[4848]: E1206 16:13:34.657653 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="gather" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657661 
4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="gather" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657905 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="gather" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657929 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="039b5292-5897-4e75-bf58-a0d79f12445c" containerName="copy" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.657947 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="118d783a-2dad-4aef-83fc-db9d1734154a" containerName="registry-server" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.659785 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.671564 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.677619 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n25pg\" (UniqueName: \"kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.677946 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.678085 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.780283 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.780385 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.780565 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n25pg\" (UniqueName: \"kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.781127 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.781500 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.799292 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n25pg\" (UniqueName: \"kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg\") pod \"community-operators-bkq6v\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:34 crc kubenswrapper[4848]: I1206 16:13:34.996249 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:35 crc kubenswrapper[4848]: I1206 16:13:35.515242 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:36 crc kubenswrapper[4848]: I1206 16:13:36.393786 4848 generic.go:334] "Generic (PLEG): container finished" podID="31655880-db61-4f54-9ad8-65b704224fa0" containerID="5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d" exitCode=0 Dec 06 16:13:36 crc kubenswrapper[4848]: I1206 16:13:36.393894 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerDied","Data":"5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d"} Dec 06 16:13:36 crc kubenswrapper[4848]: I1206 16:13:36.394108 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerStarted","Data":"f32aeabb3e1eba21e30d82cb8b564b457271f9482a93c8a1ea509e3ac06edf78"} Dec 06 
16:13:36 crc kubenswrapper[4848]: I1206 16:13:36.396355 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 16:13:37 crc kubenswrapper[4848]: I1206 16:13:37.403654 4848 generic.go:334] "Generic (PLEG): container finished" podID="31655880-db61-4f54-9ad8-65b704224fa0" containerID="c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07" exitCode=0 Dec 06 16:13:37 crc kubenswrapper[4848]: I1206 16:13:37.403887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerDied","Data":"c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07"} Dec 06 16:13:38 crc kubenswrapper[4848]: I1206 16:13:38.424863 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerStarted","Data":"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e"} Dec 06 16:13:38 crc kubenswrapper[4848]: I1206 16:13:38.451901 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bkq6v" podStartSLOduration=3.038988542 podStartE2EDuration="4.451885047s" podCreationTimestamp="2025-12-06 16:13:34 +0000 UTC" firstStartedPulling="2025-12-06 16:13:36.396106947 +0000 UTC m=+2683.694117860" lastFinishedPulling="2025-12-06 16:13:37.809003452 +0000 UTC m=+2685.107014365" observedRunningTime="2025-12-06 16:13:38.449434221 +0000 UTC m=+2685.747445134" watchObservedRunningTime="2025-12-06 16:13:38.451885047 +0000 UTC m=+2685.749895960" Dec 06 16:13:44 crc kubenswrapper[4848]: I1206 16:13:44.997722 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:44 crc kubenswrapper[4848]: I1206 16:13:44.998268 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:45 crc kubenswrapper[4848]: I1206 16:13:45.060758 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:45 crc kubenswrapper[4848]: I1206 16:13:45.524259 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:45 crc kubenswrapper[4848]: I1206 16:13:45.574065 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.150679 4848 patch_prober.go:28] interesting pod/machine-config-daemon-7mrg5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.151009 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.151053 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.151800 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f49608ba999dc0c2dd6204fb2bae6ecee08284972706246863789e2ccffb956b"} pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.151872 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" podUID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerName="machine-config-daemon" containerID="cri-o://f49608ba999dc0c2dd6204fb2bae6ecee08284972706246863789e2ccffb956b" gracePeriod=600 Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.495891 4848 generic.go:334] "Generic (PLEG): container finished" podID="fc8499a5-41f5-49e8-a206-3240532ec6a0" containerID="f49608ba999dc0c2dd6204fb2bae6ecee08284972706246863789e2ccffb956b" exitCode=0 Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.495955 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerDied","Data":"f49608ba999dc0c2dd6204fb2bae6ecee08284972706246863789e2ccffb956b"} Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.496232 4848 scope.go:117] "RemoveContainer" containerID="7145d47b5b0b351b80ba999adf47aa545363fb403ece7b45bad960b8242a8ca5" Dec 06 16:13:47 crc kubenswrapper[4848]: I1206 16:13:47.496378 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bkq6v" podUID="31655880-db61-4f54-9ad8-65b704224fa0" containerName="registry-server" containerID="cri-o://19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e" gracePeriod=2 Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.471495 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.511400 4848 generic.go:334] "Generic (PLEG): container finished" podID="31655880-db61-4f54-9ad8-65b704224fa0" containerID="19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e" exitCode=0 Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.511458 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkq6v" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.511467 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerDied","Data":"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e"} Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.511590 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkq6v" event={"ID":"31655880-db61-4f54-9ad8-65b704224fa0","Type":"ContainerDied","Data":"f32aeabb3e1eba21e30d82cb8b564b457271f9482a93c8a1ea509e3ac06edf78"} Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.511656 4848 scope.go:117] "RemoveContainer" containerID="19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.514715 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mrg5" event={"ID":"fc8499a5-41f5-49e8-a206-3240532ec6a0","Type":"ContainerStarted","Data":"e7d841f5890d5bc80af75682cf4b996caeb021c6a4d23b1c91ee6d7a4c0826b9"} Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.532424 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n25pg\" (UniqueName: \"kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg\") pod 
\"31655880-db61-4f54-9ad8-65b704224fa0\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.532495 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities\") pod \"31655880-db61-4f54-9ad8-65b704224fa0\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.532537 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content\") pod \"31655880-db61-4f54-9ad8-65b704224fa0\" (UID: \"31655880-db61-4f54-9ad8-65b704224fa0\") " Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.534011 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities" (OuterVolumeSpecName: "utilities") pod "31655880-db61-4f54-9ad8-65b704224fa0" (UID: "31655880-db61-4f54-9ad8-65b704224fa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.537045 4848 scope.go:117] "RemoveContainer" containerID="c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.540872 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg" (OuterVolumeSpecName: "kube-api-access-n25pg") pod "31655880-db61-4f54-9ad8-65b704224fa0" (UID: "31655880-db61-4f54-9ad8-65b704224fa0"). InnerVolumeSpecName "kube-api-access-n25pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.593020 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31655880-db61-4f54-9ad8-65b704224fa0" (UID: "31655880-db61-4f54-9ad8-65b704224fa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.593321 4848 scope.go:117] "RemoveContainer" containerID="5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.631789 4848 scope.go:117] "RemoveContainer" containerID="19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e" Dec 06 16:13:48 crc kubenswrapper[4848]: E1206 16:13:48.632277 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e\": container with ID starting with 19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e not found: ID does not exist" containerID="19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.632320 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e"} err="failed to get container status \"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e\": rpc error: code = NotFound desc = could not find container \"19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e\": container with ID starting with 19ed9ebbf91537bada4980563453e0104dc34d9eb8e666efb3d5200469e9d90e not found: ID does not exist" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.632343 4848 scope.go:117] 
"RemoveContainer" containerID="c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07" Dec 06 16:13:48 crc kubenswrapper[4848]: E1206 16:13:48.633458 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07\": container with ID starting with c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07 not found: ID does not exist" containerID="c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.633530 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07"} err="failed to get container status \"c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07\": rpc error: code = NotFound desc = could not find container \"c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07\": container with ID starting with c167ac95cba7e72a3a14726013231f100e6fa60912b43ecd2bacdf46ae4d3a07 not found: ID does not exist" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.633583 4848 scope.go:117] "RemoveContainer" containerID="5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d" Dec 06 16:13:48 crc kubenswrapper[4848]: E1206 16:13:48.634223 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d\": container with ID starting with 5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d not found: ID does not exist" containerID="5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.634252 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d"} err="failed to get container status \"5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d\": rpc error: code = NotFound desc = could not find container \"5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d\": container with ID starting with 5506584f2fd38d46e3f4afd47790c000a4d773a5259f258cbb5ba1b9ca7db00d not found: ID does not exist" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.634530 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n25pg\" (UniqueName: \"kubernetes.io/projected/31655880-db61-4f54-9ad8-65b704224fa0-kube-api-access-n25pg\") on node \"crc\" DevicePath \"\"" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.634562 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.634575 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31655880-db61-4f54-9ad8-65b704224fa0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.842895 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.851927 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bkq6v"] Dec 06 16:13:48 crc kubenswrapper[4848]: I1206 16:13:48.977064 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31655880-db61-4f54-9ad8-65b704224fa0" path="/var/lib/kubelet/pods/31655880-db61-4f54-9ad8-65b704224fa0/volumes"